Oct 06 08:41:06 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 06 08:41:07 crc restorecon[4566]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 
08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:41:07 crc 
restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 
08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:07 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 
08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc 
restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:41:08 crc restorecon[4566]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 08:41:08 crc kubenswrapper[4610]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:41:08 crc kubenswrapper[4610]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 08:41:08 crc kubenswrapper[4610]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:41:08 crc kubenswrapper[4610]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
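The four "Flag ... has been deprecated" entries above, and the two that follow, all point at the same remedy: move the setting into the KubeletConfiguration file passed via the kubelet's --config flag. A minimal sketch of the equivalent stanza, with illustrative values rather than this node's actual configuration (the CRI-O socket path, plugin directory, taint, and reserved amounts below are all assumptions):

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock     # replaces --container-runtime-endpoint (socket path assumed)
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec # replaces --volume-plugin-dir (path assumed)
    registerWithTaints:                                          # replaces --register-with-taints
    - key: node-role.kubernetes.io/master                        # taint assumed for illustration
      effect: NoSchedule
    systemReserved:                                              # replaces --system-reserved
      cpu: 500m                                                  # amounts assumed
      memory: 1Gi
    evictionHard:                                                # per the warning, eviction settings replace --minimum-container-ttl-duration
      memory.available: 100Mi                                    # threshold assumed

--pod-infra-container-image has no counterpart in this sketch; per the message that follows, the image garbage collector now takes the sandbox image from the CRI runtime instead.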
Oct 06 08:41:08 crc kubenswrapper[4610]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 06 08:41:08 crc kubenswrapper[4610]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.778391 4610 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783639 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783669 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783679 4610 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783688 4610 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783697 4610 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783708 4610 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783718 4610 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783727 4610 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783735 4610 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783743 4610 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783751 4610 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783759 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783767 4610 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783774 4610 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783782 4610 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783789 4610 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783797 4610 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783805 4610 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783812 4610 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783820 4610 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783827 4610 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783837 4610 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783848 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783857 4610 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783866 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783881 4610 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783890 4610 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783898 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783906 4610 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783914 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783921 4610 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783929 4610 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783936 4610 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783946 4610 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783954 4610 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783961 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783971 4610 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783980 4610 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783988 4610 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.783998 4610 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784006 4610 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784015 4610 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784024 4610 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784032 4610 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784040 4610 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784084 4610 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784092 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784099 4610 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784108 4610 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784116 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784124 4610 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784132 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784141 4610 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784149 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784182 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784194 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784204 4610 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784212 4610 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784219 4610 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784227 4610 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784236 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784244 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784251 4610 feature_gate.go:330] 
unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784259 4610 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784266 4610 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784273 4610 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784284 4610 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784294 4610 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784303 4610 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784314 4610 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.784322 4610 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785866 4610 flags.go:64] FLAG: --address="0.0.0.0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785887 4610 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785904 4610 flags.go:64] FLAG: --anonymous-auth="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785914 4610 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785926 4610 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785935 4610 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785947 4610 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785964 4610 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785974 4610 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785982 4610 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.785992 4610 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786002 4610 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786011 4610 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786020 4610 flags.go:64] FLAG: --cgroup-root="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786028 4610 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786037 4610 flags.go:64] FLAG: --client-ca-file="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786077 4610 flags.go:64] FLAG: --cloud-config="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786086 4610 flags.go:64] FLAG: --cloud-provider="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786095 4610 flags.go:64] FLAG: --cluster-dns="[]" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786106 4610 flags.go:64] FLAG: 
--cluster-domain="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786117 4610 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786127 4610 flags.go:64] FLAG: --config-dir="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786137 4610 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786147 4610 flags.go:64] FLAG: --container-log-max-files="5" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786158 4610 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786168 4610 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786177 4610 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786186 4610 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786195 4610 flags.go:64] FLAG: --contention-profiling="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786204 4610 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786213 4610 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786222 4610 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786231 4610 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786243 4610 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786252 4610 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786261 4610 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786270 4610 flags.go:64] FLAG: --enable-load-reader="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786279 4610 flags.go:64] FLAG: --enable-server="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786287 4610 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786300 4610 flags.go:64] FLAG: --event-burst="100" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786309 4610 flags.go:64] FLAG: --event-qps="50" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786318 4610 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786328 4610 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786337 4610 flags.go:64] FLAG: --eviction-hard="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786348 4610 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786356 4610 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786367 4610 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786379 4610 flags.go:64] FLAG: --eviction-soft="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786400 4610 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786417 
4610 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786430 4610 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786441 4610 flags.go:64] FLAG: --experimental-mounter-path="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786454 4610 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786465 4610 flags.go:64] FLAG: --fail-swap-on="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786476 4610 flags.go:64] FLAG: --feature-gates="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786490 4610 flags.go:64] FLAG: --file-check-frequency="20s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786499 4610 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786509 4610 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786520 4610 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786530 4610 flags.go:64] FLAG: --healthz-port="10248" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786539 4610 flags.go:64] FLAG: --help="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786548 4610 flags.go:64] FLAG: --hostname-override="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786557 4610 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786566 4610 flags.go:64] FLAG: --http-check-frequency="20s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786577 4610 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786586 4610 flags.go:64] FLAG: --image-credential-provider-config="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786595 4610 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786604 4610 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786613 4610 flags.go:64] FLAG: --image-service-endpoint="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786622 4610 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786630 4610 flags.go:64] FLAG: --kube-api-burst="100" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786639 4610 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786649 4610 flags.go:64] FLAG: --kube-api-qps="50" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786658 4610 flags.go:64] FLAG: --kube-reserved="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786666 4610 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786675 4610 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786684 4610 flags.go:64] FLAG: --kubelet-cgroups="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786692 4610 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786701 4610 flags.go:64] FLAG: --lock-file="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786712 4610 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786721 4610 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786730 4610 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786743 4610 flags.go:64] FLAG: --log-json-split-stream="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786752 4610 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786762 4610 flags.go:64] FLAG: --log-text-split-stream="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786771 4610 flags.go:64] FLAG: --logging-format="text" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786780 4610 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786789 4610 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786798 4610 flags.go:64] FLAG: --manifest-url="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786807 4610 flags.go:64] FLAG: --manifest-url-header="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786819 4610 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786827 4610 flags.go:64] FLAG: --max-open-files="1000000" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786838 4610 flags.go:64] FLAG: --max-pods="110" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786847 4610 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786856 4610 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786864 4610 flags.go:64] FLAG: --memory-manager-policy="None" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786874 4610 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786882 4610 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786891 4610 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786901 4610 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786919 4610 flags.go:64] FLAG: --node-status-max-images="50" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786928 4610 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786936 4610 flags.go:64] FLAG: --oom-score-adj="-999" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786945 4610 flags.go:64] FLAG: --pod-cidr="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786954 4610 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786966 4610 flags.go:64] FLAG: --pod-manifest-path="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786975 4610 flags.go:64] FLAG: --pod-max-pids="-1" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.786984 4610 flags.go:64] FLAG: --pods-per-core="0" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 
08:41:08.786992 4610 flags.go:64] FLAG: --port="10250" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787001 4610 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787010 4610 flags.go:64] FLAG: --provider-id="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787019 4610 flags.go:64] FLAG: --qos-reserved="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787027 4610 flags.go:64] FLAG: --read-only-port="10255" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787036 4610 flags.go:64] FLAG: --register-node="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787080 4610 flags.go:64] FLAG: --register-schedulable="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787091 4610 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787106 4610 flags.go:64] FLAG: --registry-burst="10" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787115 4610 flags.go:64] FLAG: --registry-qps="5" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787124 4610 flags.go:64] FLAG: --reserved-cpus="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787132 4610 flags.go:64] FLAG: --reserved-memory="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787144 4610 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787153 4610 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787162 4610 flags.go:64] FLAG: --rotate-certificates="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787170 4610 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787179 4610 flags.go:64] FLAG: --runonce="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787187 4610 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787197 4610 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787206 4610 flags.go:64] FLAG: --seccomp-default="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787214 4610 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787223 4610 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787233 4610 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787242 4610 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787251 4610 flags.go:64] FLAG: --storage-driver-password="root" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787260 4610 flags.go:64] FLAG: --storage-driver-secure="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787269 4610 flags.go:64] FLAG: --storage-driver-table="stats" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787278 4610 flags.go:64] FLAG: --storage-driver-user="root" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787286 4610 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787295 4610 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787304 4610 flags.go:64] FLAG: 
--system-cgroups="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787313 4610 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787328 4610 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787338 4610 flags.go:64] FLAG: --tls-cert-file="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.787347 4610 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789266 4610 flags.go:64] FLAG: --tls-min-version="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789281 4610 flags.go:64] FLAG: --tls-private-key-file="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789291 4610 flags.go:64] FLAG: --topology-manager-policy="none" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789301 4610 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789310 4610 flags.go:64] FLAG: --topology-manager-scope="container" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789322 4610 flags.go:64] FLAG: --v="2" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789335 4610 flags.go:64] FLAG: --version="false" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789347 4610 flags.go:64] FLAG: --vmodule="" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789361 4610 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.789370 4610 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789585 4610 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789781 4610 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789790 4610 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789798 4610 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789809 4610 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789818 4610 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789826 4610 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789834 4610 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789842 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789851 4610 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789858 4610 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789866 4610 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789873 4610 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789882 4610 feature_gate.go:330] unrecognized 
feature gate: AlibabaPlatform Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789890 4610 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789897 4610 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789908 4610 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789920 4610 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789930 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789939 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789948 4610 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789956 4610 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789965 4610 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789973 4610 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789982 4610 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789990 4610 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.789997 4610 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790005 4610 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790012 4610 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790020 4610 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790028 4610 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790036 4610 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790073 4610 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790082 4610 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790091 4610 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790099 4610 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790107 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790115 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790123 4610 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:41:08 crc 
kubenswrapper[4610]: W1006 08:41:08.790130 4610 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790139 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790147 4610 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790155 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790163 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790171 4610 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790179 4610 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790187 4610 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790197 4610 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790207 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790216 4610 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790224 4610 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790233 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790242 4610 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790250 4610 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790260 4610 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790268 4610 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790278 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790287 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790295 4610 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790305 4610 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
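[Editor's note] Earlier in this burst, the flags.go:64 block dumped every command-line value the kubelet started with, one FLAG: --name="value" entry at a time. A small sketch for folding such a dump back into a dictionary when auditing startup configuration; the regex and helper name are mine, inferred from the entry format above:

```python
import re

# Each dump entry looks like: ... flags.go:64] FLAG: --max-pods="110"
# (values in this log never contain embedded quotes, so a lazy match works)
FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w.-]+)="(.*?)"')

def parse_flag_dump(journal_text: str) -> dict[str, str]:
    """Collect the kubelet's startup FLAG dump into {flag: value}."""
    return {m.group(1): m.group(2) for m in FLAG_RE.finditer(journal_text)}

# Example entry taken verbatim from this log:
sample = 'I1006 08:41:08.786838 4610 flags.go:64] FLAG: --max-pods="110"'
assert parse_flag_dump(sample) == {"--max-pods": "110"}
```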
Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790315 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790323 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790331 4610 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790339 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790349 4610 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790359 4610 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790367 4610 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790376 4610 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790384 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790392 4610 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.790400 4610 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.790424 4610 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.807450 4610 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.807504 4610 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807634 4610 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807647 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807657 4610 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807667 4610 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807676 4610 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807686 4610 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807694 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807703 4610 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:41:08 crc 
kubenswrapper[4610]: W1006 08:41:08.807711 4610 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807720 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807729 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807738 4610 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807747 4610 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807757 4610 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807769 4610 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807779 4610 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807787 4610 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807796 4610 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807805 4610 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807815 4610 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807823 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807831 4610 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807839 4610 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807848 4610 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807855 4610 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807863 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807871 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807879 4610 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807888 4610 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807896 4610 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807905 4610 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807913 4610 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807922 4610 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807930 4610 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807941 4610 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807952 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807962 4610 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807970 4610 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807982 4610 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.807991 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808001 4610 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808010 4610 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808018 4610 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808027 4610 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808035 4610 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808086 4610 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808108 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808120 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808130 4610 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808142 4610 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808151 4610 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808166 4610 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
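[Editor's note] The same list of unrecognized feature gate warnings repeats in this log (compare the .7836xx, .7897xx-.790xxx, and .807xxx-.808xxx timestamp clusters): each parse of the gate set re-emits the full run. To audit the distinct gate names instead of scrolling the repeats, a dedupe sketch over the raw journal text (regex assumed from the message format):

```python
import re
from collections import Counter

# Matches the warning body emitted by feature_gate.go:330 above
GATE_RE = re.compile(r"unrecognized feature gate: (\w+)")

def gate_warning_stats(journal_text: str) -> Counter:
    """Count how often each unrecognized gate name is warned about."""
    return Counter(GATE_RE.findall(journal_text))

# Two repeats of one gate, as happens across the runs in this log:
text = (
    "W1006 08:41:08.783639 4610 feature_gate.go:330] "
    "unrecognized feature gate: MultiArchInstallAWS "
    "W1006 08:41:08.790115 4610 feature_gate.go:330] "
    "unrecognized feature gate: MultiArchInstallAWS"
)
print(gate_warning_stats(text))  # Counter({'MultiArchInstallAWS': 2})
```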
Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808180 4610 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808192 4610 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808204 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808217 4610 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808227 4610 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808238 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808249 4610 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808259 4610 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808269 4610 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808282 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808292 4610 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808303 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808314 4610 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808325 4610 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808334 4610 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808345 4610 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808355 4610 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808405 4610 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808417 4610 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.808437 4610 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808671 4610 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808684 4610 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808694 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:41:08 crc 
kubenswrapper[4610]: W1006 08:41:08.808704 4610 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808713 4610 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808721 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808729 4610 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808738 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808746 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808754 4610 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808762 4610 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808769 4610 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808778 4610 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808786 4610 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808794 4610 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808801 4610 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808809 4610 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808817 4610 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808825 4610 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808833 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808841 4610 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808849 4610 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808857 4610 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808865 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808872 4610 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808880 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808888 4610 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808895 4610 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808905 4610 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification 
Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808915 4610 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808925 4610 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808935 4610 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808945 4610 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808955 4610 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808966 4610 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808976 4610 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.808989 4610 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809026 4610 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809040 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809093 4610 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809109 4610 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809120 4610 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809138 4610 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809151 4610 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809162 4610 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809171 4610 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809179 4610 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809187 4610 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809197 4610 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809205 4610 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809213 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809222 4610 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809231 4610 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809238 4610 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809247 4610 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809254 4610 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809262 4610 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809269 4610 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809277 4610 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809285 4610 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809293 4610 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809303 4610 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809313 4610 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809322 4610 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809330 4610 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809338 4610 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809346 4610 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809354 4610 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809362 4610 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809369 4610 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.809378 4610 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.809392 4610 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.811994 4610 server.go:940] "Client rotation is on, will bootstrap in background" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.817785 4610 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.817911 4610 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
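[Editor's note] After each warning run, feature_gate.go:386 prints the effective result as feature gates: {map[Name:bool ...]}, Go's default rendering of the gate map; the same fifteen-entry map appears three times above with identical contents. A sketch that lifts that rendering into a Python dict (the parsing approach is mine, keyed to the format shown in those entries):

```python
import re

# Go renders map[string]bool as: {map[Key:true OtherKey:false ...]}
PAIR_RE = re.compile(r"(\w+):(true|false)")

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse a 'feature gates: {map[...]}' log line into {gate: enabled}."""
    body = line.split("feature gates: {map[", 1)[1]
    return {k: v == "true" for k, v in PAIR_RE.findall(body)}

# Abbreviated from the feature_gate.go:386 entries in this log:
line = ("I1006 08:41:08.809392 4610 feature_gate.go:386] feature gates: "
        "{map[CloudDualStackNodeIPs:true KMSv1:true VolumeAttributesClass:false]}")
gates = parse_feature_gates(line)
assert gates["CloudDualStackNodeIPs"] and not gates["VolumeAttributesClass"]
```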
Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.819366 4610 server.go:997] "Starting client certificate rotation" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.819403 4610 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.820354 4610 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 00:08:28.282755749 +0000 UTC Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.820454 4610 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1935h27m19.462305866s for next certificate rotation Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.853180 4610 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.857135 4610 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.883202 4610 log.go:25] "Validated CRI v1 runtime API" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.927024 4610 log.go:25] "Validated CRI v1 image API" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.930781 4610 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.942598 4610 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-08-35-44-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.942649 4610 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.955177 4610 manager.go:217] Machine: {Timestamp:2025-10-06 08:41:08.953100436 +0000 UTC m=+0.668153844 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a268cadd-0c3c-491c-869f-df56a4b697a6 BootID:ca67adee-388a-4a79-b348-5f88a51a6438 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 
DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:54:74:e1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:54:74:e1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:69:c0 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:59:ce:6f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c1:3a:1f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e5:aa:1c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:36:50:46:00:3f:17 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:c1:ff:69:30:40 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.955386 4610 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
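
The Machine record above is cAdvisor's hardware inventory: CPU topology, memory, and one entry per mounted filesystem with capacity and inode counts. The filesystem half of that inventory is essentially a statfs(2) walk over the mount table; here is a Linux-only stdlib sketch against a few of the mountpoints named in the log.

    package main

    import (
        "fmt"
        "syscall"
    )

    func main() {
        // Mountpoints taken from the filesystem partitions listed in the log.
        for _, mnt := range []string{"/var", "/boot", "/run", "/tmp"} {
            var st syscall.Statfs_t
            if err := syscall.Statfs(mnt, &st); err != nil {
                fmt.Printf("%s: %v\n", mnt, err)
                continue
            }
            capacity := st.Blocks * uint64(st.Bsize) // block count x block size, in bytes
            fmt.Printf("%s: capacity=%d inodes=%d hasInodes=%v\n",
                mnt, capacity, st.Files, st.Files > 0)
        }
    }
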
Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.955665 4610 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.957907 4610 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.958122 4610 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.958161 4610 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.958373 4610 topology_manager.go:138] "Creating topology manager with none policy" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.958383 4610 container_manager_linux.go:303] "Creating device plugin manager" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.959136 4610 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.959167 4610 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.959453 4610 state_mem.go:36] "Initialized new in-memory state store" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.959543 4610 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.962740 4610 kubelet.go:418] "Attempting to sync node with API server" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.962761 4610 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
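
The Node Config dump above fixes the resource-management policy for this node: systemd cgroup driver, 200m CPU / 350Mi memory / 350Mi ephemeral-storage system reservations, and five hard eviction thresholds expressed either as absolute quantities (memory.available < 100Mi) or as percentages of capacity (nodefs.available < 10%). The sketch below is a simplified model of how such a threshold list is resolved against capacity and checked against observed usage; the observed values are hypothetical, the capacities come from the machine info above.

    package main

    import "fmt"

    // threshold carries either an absolute quantity in bytes or a fraction of
    // capacity, mirroring the Quantity/Percentage split in the config above.
    type threshold struct {
        signal     string
        quantity   int64
        percentage float64
    }

    func resolve(t threshold, capacity int64) int64 {
        if t.quantity > 0 {
            return t.quantity
        }
        return int64(t.percentage * float64(capacity))
    }

    func main() {
        thresholds := []threshold{
            {signal: "memory.available", quantity: 100 << 20}, // 100Mi
            {signal: "nodefs.available", percentage: 0.10},    // 10%
        }
        capacity := map[string]int64{
            "memory.available": 25199480832, // MemoryCapacity from the machine info
            "nodefs.available": 85292941312, // /dev/vda4 capacity from the machine info
        }
        observed := map[string]int64{ // hypothetical current availability
            "memory.available": 18 << 30,
            "nodefs.available": 60 << 30,
        }
        for _, t := range thresholds {
            limit := resolve(t, capacity[t.signal])
            fmt.Printf("%s: observed=%d threshold=%d breached=%v\n",
                t.signal, observed[t.signal], limit, observed[t.signal] < limit)
        }
    }
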
Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.962782 4610 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.962795 4610 kubelet.go:324] "Adding apiserver pod source" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.962805 4610 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.971310 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:08 crc kubenswrapper[4610]: E1006 08:41:08.971425 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:08 crc kubenswrapper[4610]: W1006 08:41:08.971315 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:08 crc kubenswrapper[4610]: E1006 08:41:08.971481 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.977547 4610 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.978921 4610 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
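
The reflector warnings above are expected at this point in startup: the kubelet's informers try to list Services and Nodes from https://api-int.crc.testing:6443 before the API server is accepting connections, get connection refused, and retry with backoff until it comes up. Below is a stdlib-only sketch of that probe-and-retry pattern against the endpoint from the log; the attempt count and delays are illustrative, and a real reflector retries indefinitely rather than giving up.

    package main

    import (
        "log"
        "net"
        "time"
    )

    func main() {
        const addr = "api-int.crc.testing:6443" // endpoint from the log
        backoff := 500 * time.Millisecond
        for attempt := 1; attempt <= 5; attempt++ {
            conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
            if err == nil {
                conn.Close()
                log.Printf("apiserver reachable after %d attempt(s)", attempt)
                return
            }
            log.Printf("attempt %d: %v; retrying in %s", attempt, err, backoff)
            time.Sleep(backoff)
            backoff *= 2 // grow the delay between retries
        }
        log.Println("still unreachable; a reflector would keep retrying")
    }
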
Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.980393 4610 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981767 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981794 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981802 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981809 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981821 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981853 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981863 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981877 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981887 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981897 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981932 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.981942 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.984878 4610 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.985403 4610 server.go:1280] "Started kubelet" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.986827 4610 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.986833 4610 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 06 08:41:08 crc systemd[1]: Started Kubernetes Kubelet. 
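
The kubelet is now serving: port 10250 for the main API, and a podresources socket whose endpoint is rate limited at qps=100 with a burst of 10 per the ratelimit.go line above. That policy is a token bucket: the burst is the bucket size, the QPS is the refill rate. A minimal hand-rolled sketch with the numbers from the log follows; the real kubelet uses a library limiter type rather than code like this.

    package main

    import (
        "fmt"
        "time"
    )

    // bucket is a token bucket: burst is the bucket size, qps the refill rate.
    type bucket struct {
        tokens, burst, qps float64
        last               time.Time
    }

    func (b *bucket) allow(now time.Time) bool {
        b.tokens += now.Sub(b.last).Seconds() * b.qps // refill for elapsed time
        if b.tokens > b.burst {
            b.tokens = b.burst
        }
        b.last = now
        if b.tokens < 1 {
            return false
        }
        b.tokens--
        return true
    }

    func main() {
        b := &bucket{tokens: 10, burst: 10, qps: 100, last: time.Now()}
        granted := 0
        for i := 0; i < 25; i++ { // 25 back-to-back calls: the burst absorbs ~10
            if b.allow(time.Now()) {
                granted++
            }
        }
        fmt.Printf("granted %d of 25 immediate requests\n", granted)
    }
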
Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.994018 4610 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.994564 4610 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.994513 4610 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.996012 4610 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 21:46:21.503256266 +0000 UTC Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.996076 4610 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 949h5m12.507184275s for next certificate rotation Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.997704 4610 server.go:460] "Adding debug handlers to kubelet server" Oct 06 08:41:08 crc kubenswrapper[4610]: I1006 08:41:08.998871 4610 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.000225 4610 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.000240 4610 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.000323 4610 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:08.999619 4610 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:08.999870 4610 factory.go:55] Registering systemd factory Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.000717 4610 factory.go:221] Registration of the systemd container factory successfully Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.000782 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.000835 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.001125 4610 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.21:6443: connect: connection refused" interval="200ms" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.001236 4610 factory.go:153] Registering CRI-O factory Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.001324 4610 factory.go:221] Registration of the crio container factory successfully Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.001384 4610 factory.go:219] Registration 
of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.001420 4610 factory.go:103] Registering Raw factory Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.001441 4610 manager.go:1196] Started watching for new ooms in manager Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.002089 4610 manager.go:319] Starting recovery of all containers Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012332 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012519 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012533 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012547 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012558 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012569 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012581 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012594 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012608 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012621 4610 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012634 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012647 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012660 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012677 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012691 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012704 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012715 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012726 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012738 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012752 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012764 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012777 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012789 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012804 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012818 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012831 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012847 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012860 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012872 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012885 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012897 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012909 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012921 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012932 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012943 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012956 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012967 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012980 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.012992 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013005 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013045 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013071 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013106 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013119 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013132 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013144 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013155 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013204 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013216 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013229 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013241 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013251 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013267 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013280 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013292 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013303 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013316 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013333 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013345 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013358 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013370 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013384 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013396 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013407 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013419 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013433 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013444 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013455 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013465 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013476 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013487 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013497 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013515 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013527 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013539 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013550 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013561 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013576 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013587 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013604 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013615 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013626 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013638 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013648 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013661 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013672 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013686 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013698 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013710 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013721 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013734 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013748 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013758 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013770 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013782 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013795 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013808 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013824 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013842 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013855 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013868 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013880 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013896 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013910 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013928 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013942 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013954 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.009392 4610 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.21:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bda44ece500d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 08:41:08.985372885 +0000 UTC m=+0.700426273,LastTimestamp:2025-10-06 08:41:08.985372885 +0000 UTC m=+0.700426273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.013967 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014211 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014287 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014337 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014361 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014385 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014423 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014443 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014488 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014513 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 06 08:41:09 crc 
kubenswrapper[4610]: I1006 08:41:09.014530 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014582 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014598 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014613 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014673 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014692 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014707 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014743 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014763 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014778 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.014984 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015006 4610 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015020 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015036 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015077 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015093 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015109 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015172 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015187 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015205 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015251 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015271 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015320 4610 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015333 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015350 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015392 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.015412 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.019904 4610 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.019974 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.020002 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022870 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022905 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022925 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022943 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022961 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022981 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.022999 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023017 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023036 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023097 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023119 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023139 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023158 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023175 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" 
seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023192 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023209 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023227 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023245 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023262 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023283 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023299 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023315 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023333 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023349 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023366 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" 
seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023382 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023400 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023417 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023434 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023458 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023487 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023504 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023523 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023542 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023560 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023575 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" 
seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023649 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023673 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023688 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023706 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023722 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023736 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023753 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023769 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023784 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023798 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023816 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023832 4610 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023848 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023863 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023880 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023897 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023912 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023927 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023942 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023957 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023974 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.023988 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.024005 4610 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.024023 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.024042 4610 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.024071 4610 reconstruct.go:97] "Volume reconstruction finished" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.024081 4610 reconciler.go:26] "Reconciler: start to sync state" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.028241 4610 manager.go:324] Recovery completed Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.036223 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.037579 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.037612 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.037624 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.038442 4610 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.038462 4610 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.038479 4610 state_mem.go:36] "Initialized new in-memory state store" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.065193 4610 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.066201 4610 policy_none.go:49] "None policy: Start" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.067759 4610 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.067783 4610 state_mem.go:35] "Initializing new in-memory state store" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.069082 4610 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.069162 4610 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.069208 4610 kubelet.go:2335] "Starting kubelet main sync loop" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.069314 4610 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.069990 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.070103 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.101136 4610 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.129301 4610 manager.go:334] "Starting Device Plugin manager" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.129362 4610 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.129377 4610 server.go:79] "Starting device plugin registration server" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.129943 4610 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.129962 4610 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.130515 4610 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.130621 4610 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.130630 4610 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.141761 4610 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.170124 4610 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.170359 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.171804 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.171830 4610 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.171854 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.171962 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.172402 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.172458 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.172774 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.172817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.172836 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.173013 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.173225 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.173307 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174597 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174605 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174691 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174702 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174710 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174839 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.174868 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176055 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176085 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176095 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176093 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176241 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176319 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176439 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176470 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176592 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176620 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.176632 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177258 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177277 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177288 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177526 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177545 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177553 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177690 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.177715 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.178279 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.178297 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.178305 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.202518 4610 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.21:6443: connect: connection refused" interval="400ms" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.225831 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.225929 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226109 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226199 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226333 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226430 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226465 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226529 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226607 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226702 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226737 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226802 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226883 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.226915 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.227001 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.234897 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: 
I1006 08:41:09.235959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.236017 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.236028 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.236071 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.236528 4610 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.21:6443: connect: connection refused" node="crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328345 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328431 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328553 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328582 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328611 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328639 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328665 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328690 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328944 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328974 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.328981 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329018 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329095 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329000 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329144 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329027 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329194 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329223 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329233 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329285 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329111 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329340 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329281 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329305 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329411 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329275 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329435 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329568 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329560 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.329679 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.437522 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.438977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.439021 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.439032 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.439085 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.439689 4610 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.21:6443: connect: connection refused" node="crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.506671 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.514017 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.528914 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.546844 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.604147 4610 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.21:6443: connect: connection refused" interval="800ms" Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.644620 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-395d5e414cbad90e901d53695cde94dc14d46605e4c9566187d21cdd8cab4f0e WatchSource:0}: Error finding container 395d5e414cbad90e901d53695cde94dc14d46605e4c9566187d21cdd8cab4f0e: Status 404 returned error can't find the container with id 395d5e414cbad90e901d53695cde94dc14d46605e4c9566187d21cdd8cab4f0e Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.645531 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4ee34816b14be8caf4d36a9e267669cc5e0f5c4cf45d6520cac16fe1f23057c2 WatchSource:0}: Error finding container 4ee34816b14be8caf4d36a9e267669cc5e0f5c4cf45d6520cac16fe1f23057c2: Status 404 returned error can't find the container with id 4ee34816b14be8caf4d36a9e267669cc5e0f5c4cf45d6520cac16fe1f23057c2 Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.646574 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ee28efba5f8a9e6462e14334ba29df8ac7feac1d35e8cf3ce7331dc06add7759 WatchSource:0}: Error finding container ee28efba5f8a9e6462e14334ba29df8ac7feac1d35e8cf3ce7331dc06add7759: Status 404 returned error can't find the container with id ee28efba5f8a9e6462e14334ba29df8ac7feac1d35e8cf3ce7331dc06add7759 Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.647423 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c708128f869d721d6e48677167c3cbe7bffb413e0a0a307549a03ab546061c0a WatchSource:0}: Error finding container c708128f869d721d6e48677167c3cbe7bffb413e0a0a307549a03ab546061c0a: Status 404 returned error can't find the container with id c708128f869d721d6e48677167c3cbe7bffb413e0a0a307549a03ab546061c0a Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.647973 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-66ae137a763192376feb13ad56a3241c9712e2b86e94e4fbde624e75d193611f WatchSource:0}: Error finding container 66ae137a763192376feb13ad56a3241c9712e2b86e94e4fbde624e75d193611f: Status 404 returned error can't find the container with id 66ae137a763192376feb13ad56a3241c9712e2b86e94e4fbde624e75d193611f Oct 06 08:41:09 crc kubenswrapper[4610]: W1006 08:41:09.802607 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.803146 4610 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.840556 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.843081 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.843122 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.843131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:09 crc kubenswrapper[4610]: I1006 08:41:09.843157 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:41:09 crc kubenswrapper[4610]: E1006 08:41:09.843521 4610 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.21:6443: connect: connection refused" node="crc" Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.002006 4610 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.074564 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"66ae137a763192376feb13ad56a3241c9712e2b86e94e4fbde624e75d193611f"} Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.075885 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ee34816b14be8caf4d36a9e267669cc5e0f5c4cf45d6520cac16fe1f23057c2"} Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.077009 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ee28efba5f8a9e6462e14334ba29df8ac7feac1d35e8cf3ce7331dc06add7759"} Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.078098 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"395d5e414cbad90e901d53695cde94dc14d46605e4c9566187d21cdd8cab4f0e"} Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.079003 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c708128f869d721d6e48677167c3cbe7bffb413e0a0a307549a03ab546061c0a"} Oct 06 08:41:10 crc kubenswrapper[4610]: W1006 08:41:10.236272 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:10 crc kubenswrapper[4610]: E1006 08:41:10.236406 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:10 crc kubenswrapper[4610]: W1006 08:41:10.291132 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:10 crc kubenswrapper[4610]: E1006 08:41:10.291234 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:10 crc kubenswrapper[4610]: E1006 08:41:10.405438 4610 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.21:6443: connect: connection refused" interval="1.6s" Oct 06 08:41:10 crc kubenswrapper[4610]: W1006 08:41:10.567624 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:10 crc kubenswrapper[4610]: E1006 08:41:10.567741 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.644280 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.645658 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.645696 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.645707 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:10 crc kubenswrapper[4610]: I1006 08:41:10.645731 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:41:10 crc kubenswrapper[4610]: E1006 08:41:10.646190 4610 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.21:6443: connect: connection refused" node="crc" Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.001881 4610 csi_plugin.go:884] Failed to contact API 
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.001881 4610 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.082344 4610 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae" exitCode=0
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.082400 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae"}
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.082457 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.083634 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.083660 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.083673 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.085399 4610 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b" exitCode=0
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.085492 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b"}
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.085762 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.086954 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.086977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.086989 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.091474 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.093102 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.093155 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.093167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.093301 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9"}
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.095623 4610 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd" exitCode=0
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.095777 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.096233 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd"}
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.097031 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.097082 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.097094 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.099704 4610 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34" exitCode=0
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.099748 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34"}
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.099854 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.101832 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.101915 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:11 crc kubenswrapper[4610]: I1006 08:41:11.101969 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.002191 4610 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused
Oct 06 08:41:12 crc kubenswrapper[4610]: E1006 08:41:12.006467 4610 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.21:6443: connect: connection refused" interval="3.2s"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.107151 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b"}
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.107217 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d"}
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.107237 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4"}
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.107355 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.108500 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.108546 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.108555 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.109534 4610 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c" exitCode=0
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.109644 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c"}
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.109688 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.110392 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.110424 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.110434 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.112560 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc"}
Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.112613 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109"}
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34"} Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.112641 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8"} Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.125711 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c"} Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.125802 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b"} Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.125815 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089"} Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.125844 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.127021 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.127073 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.127087 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.128868 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"db5c6c898928adc83582dd1007b11f63bf3d013f8a53a6b3f9700c7d8de18275"} Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.128989 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.129918 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.129947 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.129959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.247288 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.250259 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 
08:41:12.250304 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.250319 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:12 crc kubenswrapper[4610]: I1006 08:41:12.250345 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:41:12 crc kubenswrapper[4610]: E1006 08:41:12.250808 4610 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.21:6443: connect: connection refused" node="crc" Oct 06 08:41:12 crc kubenswrapper[4610]: W1006 08:41:12.290552 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:12 crc kubenswrapper[4610]: E1006 08:41:12.290643 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:12 crc kubenswrapper[4610]: W1006 08:41:12.436564 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:12 crc kubenswrapper[4610]: E1006 08:41:12.436651 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:12 crc kubenswrapper[4610]: W1006 08:41:12.453772 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:12 crc kubenswrapper[4610]: E1006 08:41:12.453940 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:12 crc kubenswrapper[4610]: W1006 08:41:12.705200 4610 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:12 crc kubenswrapper[4610]: E1006 08:41:12.705304 4610 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.129.56.21:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.001920 4610 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.21:6443: connect: connection refused Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.141416 4610 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76" exitCode=0 Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.141543 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76"} Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.141761 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.143300 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.143350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.143363 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.148741 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b"} Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.148808 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.148846 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.148919 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.149022 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.149061 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.150482 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.150516 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.150529 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.150734 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.150775 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.150800 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.151166 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.151208 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.151172 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.151249 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.151223 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:13 crc kubenswrapper[4610]: I1006 08:41:13.151262 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.157331 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5"} Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.157385 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907"} Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.157400 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40"} Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.157414 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d"} Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.159645 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.162316 4610 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b" exitCode=255 Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.162352 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b"} Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.162438 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.162476 4610 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.163598 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.163682 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.163698 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.163830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.163851 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.163865 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.164435 4610 scope.go:117] "RemoveContainer" containerID="ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b" Oct 06 08:41:14 crc kubenswrapper[4610]: I1006 08:41:14.762311 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.171223 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d"} Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.171458 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.173106 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.173145 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.173160 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.174878 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.177020 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728"} Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.177159 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.177202 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.177973 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.178013 4610 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.178030 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.234140 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.451545 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.453375 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.453477 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.453516 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:15 crc kubenswrapper[4610]: I1006 08:41:15.453570 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.179665 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.179739 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.180656 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.180936 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.180957 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.180964 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.181689 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.181727 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.181742 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:16 crc kubenswrapper[4610]: I1006 08:41:16.834873 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.100009 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.182526 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.182584 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.184313 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.184375 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.184399 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.185136 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.185270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.185352 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.276514 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.276718 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.277925 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.278073 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:17 crc kubenswrapper[4610]: I1006 08:41:17.278157 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.075772 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.085457 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.116829 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.185095 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.185185 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.185681 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.186101 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.186136 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.186149 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.186811 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.186830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:18 crc kubenswrapper[4610]: I1006 08:41:18.186838 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:19 crc kubenswrapper[4610]: E1006 08:41:19.142067 4610 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.187837 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.187852 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.189119 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.189163 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.189179 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.189221 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.189246 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:19 crc kubenswrapper[4610]: I1006 08:41:19.189259 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:21 crc kubenswrapper[4610]: I1006 08:41:21.885466 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:21 crc kubenswrapper[4610]: I1006 08:41:21.885733 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:21 crc kubenswrapper[4610]: I1006 08:41:21.887497 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:21 crc kubenswrapper[4610]: I1006 08:41:21.887572 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:21 crc kubenswrapper[4610]: I1006 08:41:21.887591 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:21 crc kubenswrapper[4610]: I1006 08:41:21.893181 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.147918 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.148137 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.149297 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.149347 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.149359 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.197243 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.198807 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.198857 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:22 crc kubenswrapper[4610]: I1006 08:41:22.198870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:23 crc kubenswrapper[4610]: I1006 08:41:23.848799 4610 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 08:41:23 crc kubenswrapper[4610]: I1006 08:41:23.848890 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 08:41:23 crc kubenswrapper[4610]: I1006 08:41:23.853299 4610 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 08:41:23 crc kubenswrapper[4610]: I1006 08:41:23.853727 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 08:41:24 crc kubenswrapper[4610]: I1006 08:41:24.763602 4610 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 06 08:41:24 crc kubenswrapper[4610]: I1006 08:41:24.763726 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 06 08:41:24 crc kubenswrapper[4610]: I1006 08:41:24.886319 4610 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
Oct 06 08:41:24 crc kubenswrapper[4610]: I1006 08:41:24.886319 4610 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 06 08:41:24 crc kubenswrapper[4610]: I1006 08:41:24.886432 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.840833 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.841023 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.841466 4610 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.841572 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.842268 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.842338 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.842364 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:26 crc kubenswrapper[4610]: I1006 08:41:26.845653 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.100924 4610 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.101009 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.208684 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.209356 4610 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.209442 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.209696 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.209766 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:27 crc kubenswrapper[4610]: I1006 08:41:27.209785 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:28 crc kubenswrapper[4610]: E1006 08:41:28.853588 4610 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.855380 4610 trace.go:236] Trace[834608651]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:41:17.309) (total time: 11545ms):
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[834608651]: ---"Objects listed" error: 11545ms (08:41:28.855)
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[834608651]: [11.545334341s] [11.545334341s] END
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.855439 4610 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 06 08:41:28 crc kubenswrapper[4610]: E1006 08:41:28.857858 4610 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.860152 4610 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.860306 4610 trace.go:236] Trace[846954203]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:41:17.148) (total time: 11711ms):
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[846954203]: ---"Objects listed" error: 11711ms (08:41:28.860)
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[846954203]: [11.711381161s] [11.711381161s] END
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.860336 4610 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.862225 4610 trace.go:236] Trace[891641054]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:41:16.923) (total time: 11939ms):
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[891641054]: ---"Objects listed" error: 11938ms (08:41:28.861)
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[891641054]: [11.939143132s] [11.939143132s] END
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.862259 4610 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.864732 4610 trace.go:236] Trace[1606055099]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:41:17.737) (total time: 11127ms):
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[1606055099]: ---"Objects listed" error: 11127ms (08:41:28.864)
Oct 06 08:41:28 crc kubenswrapper[4610]: Trace[1606055099]: [11.127477968s] [11.127477968s] END
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.865100 4610 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.975876 4610 apiserver.go:52] "Watching apiserver"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.980193 4610 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.980422 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.980840 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:41:28 crc kubenswrapper[4610]: E1006 08:41:28.980904 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.980943 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.981173 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.981336 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.981611 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.981637 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 06 08:41:28 crc kubenswrapper[4610]: E1006 08:41:28.981701 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:28 crc kubenswrapper[4610]: E1006 08:41:28.981434 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.988923 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.988953 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.988980 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.989226 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.989825 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.990017 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.996153 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.996243 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 06 08:41:28 crc kubenswrapper[4610]: I1006 08:41:28.996399 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.001643 4610 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.029802 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.045427 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.056284 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.061653 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.061913 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062034 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062185 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062278 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062364 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062460 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062568 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062687 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062772 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062867 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062961 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063057 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062213 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.062971 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063123 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063163 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063426 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063620 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063174 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063682 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063708 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063726 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063729 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063747 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063842 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063902 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063936 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063968 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063995 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064019 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064099 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064140 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064169 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064196 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064221 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064245 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064269 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064326 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064353 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064381 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064407 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064437 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064472 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064501 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064527 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064551 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064577 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064629 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064676 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064704 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064730 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064756 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064780 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064804 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064831 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064858 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064884 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064915 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064957 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064984 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.065012 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.063904 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064061 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064430 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064783 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.064930 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.065124 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.065346 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.065350 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.065520 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.065857 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066013 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066037 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066117 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066258 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066370 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066402 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066437 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066512 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066635 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066703 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066668 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066754 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066941 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066989 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.067026 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.067243 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.066242 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.067599 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.067624 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.067997 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068478 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068502 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068100 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068216 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068556 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068584 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068611 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068652 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068678 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068708 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068735 4610 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068762 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068794 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068820 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068919 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.068948 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069003 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069032 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069079 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069107 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " 
Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069133 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069159 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069186 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069212 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069239 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069269 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069297 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069323 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069364 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069391 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" 
(UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069417 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069444 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069471 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069495 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069521 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069546 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069574 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069600 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069630 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069655 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069681 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069706 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069731 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069753 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069777 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069800 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069821 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069842 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069870 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069891 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069912 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069935 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069957 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069978 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070000 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070026 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070089 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070115 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070137 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070160 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070184 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070209 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070234 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070255 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070279 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070301 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070324 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070345 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070366 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070387 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070406 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070429 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070455 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070477 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070496 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070516 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070536 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070560 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070582 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070605 4610 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070629 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070651 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070674 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070699 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070721 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070745 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070768 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070790 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070824 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070848 
4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070870 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070892 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070914 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070937 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070962 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.070986 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071010 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071033 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071073 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 
08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071094 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071116 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071136 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071157 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071178 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071202 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071225 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071248 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071273 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071298 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071325 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071349 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071374 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071401 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071422 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071447 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071470 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071499 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071523 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071548 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071575 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071600 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071640 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071670 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071698 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071726 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071752 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071777 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071805 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071833 4610 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071877 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071903 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071927 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.071981 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.072006 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.072033 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073058 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073089 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073119 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:41:29 crc kubenswrapper[4610]: 
I1006 08:41:29.073145 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073174 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073202 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073226 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073252 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073319 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073353 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073386 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073419 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073449 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073484 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073516 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073544 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073567 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073594 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073621 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073645 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073670 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073696 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073807 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073825 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073841 4610 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073855 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073868 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073888 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073904 4610 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073917 4610 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073930 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073942 4610 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073955 4610 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath 
\"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073970 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073983 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073995 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074008 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074021 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074036 4610 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074107 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074121 4610 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074133 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074146 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074161 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074174 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074189 4610 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 
08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074206 4610 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074221 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074248 4610 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074264 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074277 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074292 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074306 4610 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074321 4610 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074335 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074349 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074363 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074376 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074389 4610 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: 
I1006 08:41:29.074403 4610 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074416 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074429 4610 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.076898 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078021 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.069263 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.072116 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.072358 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073142 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.073602 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.080421 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074641 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.074744 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.075095 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.075214 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.075774 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:41:29.575624008 +0000 UTC m=+21.290677396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.075754 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.076145 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.077263 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.077365 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.077499 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.077753 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078061 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078115 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078627 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078697 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078862 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078883 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.078912 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.079234 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.079764 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.080082 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.080091 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.080198 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.080374 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.081249 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.081353 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.081511 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.081601 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.082796 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.082905 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.082986 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.083547 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.083675 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.083638 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.083715 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084074 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084256 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084268 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084278 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084335 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084421 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084460 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084705 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.084967 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085158 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085451 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085615 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085715 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085742 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085939 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085959 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.087197 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.087052 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.085957 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086073 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086516 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086561 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086584 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086783 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086805 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086635 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.087405 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086960 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.087427 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.087434 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.086995 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.087988 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.089145 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.089870 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.089933 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.090585 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.090703 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.090773 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.091035 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.091093 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.091481 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.091584 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.091990 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.092270 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.092398 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.092480 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.092869 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.092907 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.093057 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.093090 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.093270 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.093367 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.093558 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.093899 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.094097 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.094180 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.094237 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.094337 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:29.594310789 +0000 UTC m=+21.309364167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.094590 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.094856 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.095135 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.095251 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.095528 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.096293 4610 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.096876 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.096976 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.097385 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.097399 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.097646 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.097885 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.098126 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.098751 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.098892 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.099143 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.099626 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.099666 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.100011 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.100130 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.100152 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.100483 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.100564 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.101162 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.101508 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.101998 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.103718 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.105202 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:29.605177614 +0000 UTC m=+21.320231002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.105913 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.111745 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.117728 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.117745 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.118344 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.117508 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.118863 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.119741 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.119964 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.121147 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.121180 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.121261 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:29.621239656 +0000 UTC m=+21.336293044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.121525 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.123147 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.123271 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.123637 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.123658 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.123668 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.123705 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:29.62369577 +0000 UTC m=+21.338749158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.123804 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.123833 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.124442 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.126149 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.127487 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.128356 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.129090 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.133512 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.133717 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.133976 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.134218 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.134304 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.134502 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.134671 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.136177 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.136576 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.136661 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.136795 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.136231 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.137032 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.137214 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.137441 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.137767 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.137796 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.138033 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.138242 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.138542 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.139241 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.139380 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.139628 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.141816 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.142542 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.143009 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.143088 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.144900 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.145647 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.149576 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.149945 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.150422 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.152085 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.155355 4610 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.156459 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 
08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.160099 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.163338 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.167343 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.168381 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.169298 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175311 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175766 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175823 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175946 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175962 4610 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175975 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.175989 4610 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176002 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176015 4610 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176027 4610 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176039 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176070 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176081 4610 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 
08:41:29.176093 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176107 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176118 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176130 4610 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176142 4610 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176263 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176371 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176587 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176637 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176635 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176653 4610 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176714 4610 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176727 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176739 4610 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176749 4610 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176759 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176770 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176781 4610 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176793 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176804 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176813 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176825 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176835 4610 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176624 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176845 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176883 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176897 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176909 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176919 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176932 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176943 4610 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176956 4610 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176966 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176980 4610 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.176994 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177019 4610 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177034 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177061 4610 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177073 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177084 4610 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177096 4610 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177111 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177125 4610 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 
06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177139 4610 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177149 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177159 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177170 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177179 4610 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177192 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177205 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177217 4610 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177230 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177240 4610 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177250 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177260 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177271 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 
08:41:29.177286 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177298 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177311 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177323 4610 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177336 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177347 4610 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177357 4610 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177370 4610 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177383 4610 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177395 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177408 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177420 4610 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177432 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177445 4610 reconciler_common.go:293] 
"Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177458 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177471 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177483 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177493 4610 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177502 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177511 4610 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177520 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177530 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177539 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177549 4610 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177558 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177577 4610 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177587 4610 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177596 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177605 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177614 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177624 4610 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177632 4610 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177642 4610 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177682 4610 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177700 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177709 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177717 4610 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177728 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177736 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177745 4610 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177753 4610 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177766 4610 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177776 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177786 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177796 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177806 4610 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177814 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177822 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177830 4610 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177839 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177847 4610 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177856 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177864 4610 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177873 4610 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177881 4610 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177889 4610 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177897 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177907 4610 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177917 4610 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177926 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177935 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177946 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177959 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177970 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177980 4610 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: 
I1006 08:41:29.177989 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.177998 4610 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178006 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178016 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178023 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178031 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178500 4610 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178515 4610 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178525 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178533 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178563 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178573 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178581 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc 
kubenswrapper[4610]: I1006 08:41:29.178590 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178598 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178629 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178638 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178646 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178655 4610 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178682 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178691 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178717 4610 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178725 4610 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178735 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.178750 4610 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.180737 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 
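
The reconciler_common.go:293 entries above show the kubelet's volume manager confirming, volume by volume, that nothing is still attached or mounted for the deleted pods, and the kubelet_volumes.go:163 entries that follow record the removal of each pod's leftover directory under /var/lib/kubelet/pods/<uid>/volumes once its volumes are all gone. A minimal Go sketch of that housekeeping check, assuming only the standard kubelet pods directory layout; it is illustrative, not kubelet source:

// Hedged sketch, not kubelet code: walk /var/lib/kubelet/pods and report
// pod UID directories whose volumes/ subtree is empty -- roughly the state
// the "Cleaned up orphaned pod volumes dir" entries above describe.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const podsRoot = "/var/lib/kubelet/pods" // standard kubelet pods dir
	entries, err := os.ReadDir(podsRoot)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, e := range entries {
		if !e.IsDir() {
			continue
		}
		volDir := filepath.Join(podsRoot, e.Name(), "volumes")
		subdirs, err := os.ReadDir(volDir)
		if err != nil {
			continue // no volumes dir left: already cleaned up
		}
		if len(subdirs) == 0 {
			fmt.Printf("pod %s: empty volumes dir (cleanup candidate)\n", e.Name())
		}
	}
}

Run on the node, this would list pod UID directories whose volumes/ subtree is already empty, i.e. the same cleanup candidates the log reports.
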
08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.181742 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.183740 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.184726 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.185933 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.186588 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.187334 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.188366 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.188681 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.188980 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.189950 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.190624 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.196213 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.196892 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.198473 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.199122 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.200185 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.200670 4610 
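
Every "Failed to update status for pod" entry in this stretch fails for the same reason: the API server forwards the kubelet's status PATCH through the pod.network-node-identity.openshift.io admission webhook, and nothing is listening on 127.0.0.1:9743 yet, so each call ends in connect: connection refused. A short Go probe of that endpoint, assuming only the address taken from the log entries; it is illustrative, not part of any OpenShift tooling:

// Hedged sketch: probe the webhook endpoint named in the
// "failed calling webhook" entries above. The address 127.0.0.1:9743
// comes straight from the log; everything else is illustrative.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		// Matches the log: "dial tcp 127.0.0.1:9743: connect: connection refused"
		fmt.Println("webhook unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("webhook port is accepting connections")
}

Once the network-node-identity webhook pod (whose own recreation is logged below) is listening again, the probe succeeds and the queued status patches go through.
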
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.201204 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.202392 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.202948 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.207521 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.217415 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.217964 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.218022 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.220929 4610 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728" exitCode=255 Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.220993 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728"} Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.221109 4610 scope.go:117] "RemoveContainer" containerID="ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.229678 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.235261 4610 scope.go:117] "RemoveContainer" containerID="90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.235516 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.235633 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.243785 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.254952 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.267610 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.278839 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.279404 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.279435 4610 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.290694 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.296662 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.304679 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.305544 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.313220 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.316768 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: W1006 08:41:29.321174 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-842c62c17bee9dca5d80d63989f492903ed8873f6c886b97fe8635e7894cb2b4 WatchSource:0}: Error finding container 842c62c17bee9dca5d80d63989f492903ed8873f6c886b97fe8635e7894cb2b4: Status 404 returned error can't find the container with id 842c62c17bee9dca5d80d63989f492903ed8873f6c886b97fe8635e7894cb2b4 Oct 06 08:41:29 crc kubenswrapper[4610]: W1006 08:41:29.322775 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e13d85145fa0259f42ecd0a2b47cdf1061ac472d696687cffdd88b4cdf0d20e2 WatchSource:0}: Error finding container e13d85145fa0259f42ecd0a2b47cdf1061ac472d696687cffdd88b4cdf0d20e2: Status 404 returned error can't find the container with id e13d85145fa0259f42ecd0a2b47cdf1061ac472d696687cffdd88b4cdf0d20e2 Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.333202 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"message\\\":\\\"W1006 08:41:12.338474 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:12.338960 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740072 cert, and key in /tmp/serving-cert-3412719904/serving-signer.crt, /tmp/serving-cert-3412719904/serving-signer.key\\\\nI1006 08:41:12.766579 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:12.769879 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 08:41:12.770077 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:12.771747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3412719904/tls.crt::/tmp/serving-cert-3412719904/tls.key\\\\\\\"\\\\nF1006 08:41:13.031211 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.584743 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.584915 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:41:30.584884731 +0000 UTC m=+22.299938119 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.685786 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.685867 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.685898 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:29 crc kubenswrapper[4610]: I1006 08:41:29.685928 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686180 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686301 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:30.686272493 +0000 UTC m=+22.401325891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686376 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686417 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686462 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686480 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686426 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686547 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:30.68652797 +0000 UTC m=+22.401581358 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686553 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686639 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:30.686617142 +0000 UTC m=+22.401670530 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686791 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:29 crc kubenswrapper[4610]: E1006 08:41:29.686855 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:30.686844258 +0000 UTC m=+22.401897656 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.069646 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.069865 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.225709 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729"} Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.225766 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"842c62c17bee9dca5d80d63989f492903ed8873f6c886b97fe8635e7894cb2b4"} Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.228449 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.237426 4610 scope.go:117] "RemoveContainer" containerID="90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.237668 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6b357dc13e60a0ccfc8a5b8ab152115a461c3fc9f8aa8f1a1e4d72ebe68c9e5a"} Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.237756 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.239205 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78"} Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.239246 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b"} Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.239260 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e13d85145fa0259f42ecd0a2b47cdf1061ac472d696687cffdd88b4cdf0d20e2"} Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.247500 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.263720 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc2ec690b959ae24ef259e3b41065e4adfa627a767efbba51e4153298fa396b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"message\\\":\\\"W1006 08:41:12.338474 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:12.338960 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740072 cert, and key in /tmp/serving-cert-3412719904/serving-signer.crt, /tmp/serving-cert-3412719904/serving-signer.key\\\\nI1006 08:41:12.766579 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:12.769879 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 08:41:12.770077 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:12.771747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3412719904/tls.crt::/tmp/serving-cert-3412719904/tls.key\\\\\\\"\\\\nF1006 08:41:13.031211 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.278013 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.291912 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.309832 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.325214 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.339479 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.353261 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T0
8:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.378210 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.399076 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.414657 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.428812 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.451618 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.467528 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.595606 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.595774 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:41:32.595748795 +0000 UTC m=+24.310802183 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.697103 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.697181 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.697220 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697281 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697373 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:32.697346503 +0000 UTC m=+24.412399921 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697372 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697394 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697420 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697447 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697459 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:32.697435006 +0000 UTC m=+24.412488424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: I1006 08:41:30.697278 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697484 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:32.697472467 +0000 UTC m=+24.412525855 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697531 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697546 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697555 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:30 crc kubenswrapper[4610]: E1006 08:41:30.697594 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:32.69758517 +0000 UTC m=+24.412638558 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.070087 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.070087 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:31 crc kubenswrapper[4610]: E1006 08:41:31.070219 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:31 crc kubenswrapper[4610]: E1006 08:41:31.070251 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.076139 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.076638 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.077966 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.078553 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.079589 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.080274 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.080825 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.081857 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.082507 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.083467 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.083918 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.084970 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.085426 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.086267 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.086907 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.087571 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.088455 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.089025 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.889861 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.894617 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.899588 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.906404 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.918593 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.932532 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.945785 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.961178 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:31 crc kubenswrapper[4610]: I1006 08:41:31.975768 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:31Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.016979 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.031700 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.043777 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.060904 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.069622 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.069763 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.075036 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.085850 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.097654 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.108929 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.121698 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.247139 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491"} Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.266783 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.282579 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.300463 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.317131 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.331960 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.356937 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.371103 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.384734 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.560268 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 06 08:41:32 crc 
kubenswrapper[4610]: I1006 08:41:32.579381 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.585884 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.587938 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.604626 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.612928 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.613110 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:41:36.613075308 +0000 UTC m=+28.328128836 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.621056 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.638807 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.661222 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.675533 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.693985 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.707447 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.713669 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.713706 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.713726 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.713747 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713870 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713888 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713897 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713938 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:36.713922996 +0000 UTC m=+28.428976384 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713983 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713992 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.713999 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.714019 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:36.714013139 +0000 UTC m=+28.429066527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.714068 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.714090 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:36.714083581 +0000 UTC m=+28.429136979 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.714131 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: E1006 08:41:32.714152 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:41:36.714146832 +0000 UTC m=+28.429200220 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.728024 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.745332 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.762808 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.775857 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.788840 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.800703 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.813723 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.834619 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:32 crc kubenswrapper[4610]: I1006 08:41:32.854198 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:33 crc kubenswrapper[4610]: I1006 08:41:33.069524 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:33 crc kubenswrapper[4610]: I1006 08:41:33.069639 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:33 crc kubenswrapper[4610]: E1006 08:41:33.069716 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:33 crc kubenswrapper[4610]: E1006 08:41:33.069827 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.069677 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:34 crc kubenswrapper[4610]: E1006 08:41:34.069805 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.710015 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-kf48m"] Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.710332 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.710428 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-d58xp"] Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.711581 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6w5xr"] Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.711721 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.712235 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.713820 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.714227 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.714711 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.718654 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.718673 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.718944 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.719063 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.719184 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.723359 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.723379 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.724894 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.726422 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.728109 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.732281 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kdc9x"] Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.733842 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: W1006 08:41:34.735432 4610 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:41:34 crc kubenswrapper[4610]: E1006 08:41:34.735477 4610 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:41:34 crc kubenswrapper[4610]: W1006 08:41:34.735544 4610 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:41:34 crc kubenswrapper[4610]: E1006 08:41:34.735559 4610 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.759740 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.762726 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.763336 4610 scope.go:117] "RemoveContainer" containerID="90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728" Oct 06 08:41:34 crc kubenswrapper[4610]: E1006 08:41:34.763464 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.781580 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.798782 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.811952 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830361 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a19d05-9838-4c7d-aa2c-e778a2ef0148-proxy-tls\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830407 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q747\" (UniqueName: \"kubernetes.io/projected/03a2c34b-edd9-489b-a8e6-23502cdeb309-kube-api-access-5q747\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830430 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp96d\" (UniqueName: \"kubernetes.io/projected/d1deae4d-39c0-4684-8851-d2e6da166a93-kube-api-access-bp96d\") 
pod \"node-resolver-kf48m\" (UID: \"d1deae4d-39c0-4684-8851-d2e6da166a93\") " pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830460 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-cnibin\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830533 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-cni-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830585 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-conf-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830605 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-os-release\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830732 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-cni-bin\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830773 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-multus-certs\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830805 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cnibin\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830825 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cni-binary-copy\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830857 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-cni-multus\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830914 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-hostroot\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830942 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99a19d05-9838-4c7d-aa2c-e778a2ef0148-rootfs\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.830976 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-system-cni-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831002 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-kubelet\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831026 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-daemon-config\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831110 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1deae4d-39c0-4684-8851-d2e6da166a93-hosts-file\") pod \"node-resolver-kf48m\" (UID: \"d1deae4d-39c0-4684-8851-d2e6da166a93\") " pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831151 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-system-cni-dir\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831194 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831224 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831243 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03a2c34b-edd9-489b-a8e6-23502cdeb309-cni-binary-copy\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831259 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a19d05-9838-4c7d-aa2c-e778a2ef0148-mcd-auth-proxy-config\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831333 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-socket-dir-parent\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831383 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-netns\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831405 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-etc-kubernetes\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831433 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvk2j\" (UniqueName: \"kubernetes.io/projected/99a19d05-9838-4c7d-aa2c-e778a2ef0148-kube-api-access-xvk2j\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831461 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbgrz\" (UniqueName: \"kubernetes.io/projected/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-kube-api-access-hbgrz\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831493 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-os-release\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 
08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.831532 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-k8s-cni-cncf-io\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.837814 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.853561 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.869562 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.882529 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.900594 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.921685 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932312 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cnibin\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932377 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cni-binary-copy\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932399 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-cni-bin\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932418 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-multus-certs\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932442 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-cni-multus\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932458 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-hostroot\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932452 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cnibin\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932473 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-system-cni-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932514 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-multus-certs\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932540 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-kubelet\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932535 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-cni-bin\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932562 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-hostroot\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: 
I1006 08:41:34.932563 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99a19d05-9838-4c7d-aa2c-e778a2ef0148-rootfs\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932583 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-kubelet\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932582 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99a19d05-9838-4c7d-aa2c-e778a2ef0148-rootfs\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932560 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-system-cni-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932563 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-var-lib-cni-multus\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932651 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-daemon-config\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932819 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-system-cni-dir\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932863 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932881 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-system-cni-dir\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932937 4610 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1deae4d-39c0-4684-8851-d2e6da166a93-hosts-file\") pod \"node-resolver-kf48m\" (UID: \"d1deae4d-39c0-4684-8851-d2e6da166a93\") " pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932967 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932985 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03a2c34b-edd9-489b-a8e6-23502cdeb309-cni-binary-copy\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.932991 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1deae4d-39c0-4684-8851-d2e6da166a93-hosts-file\") pod \"node-resolver-kf48m\" (UID: \"d1deae4d-39c0-4684-8851-d2e6da166a93\") " pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933000 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a19d05-9838-4c7d-aa2c-e778a2ef0148-mcd-auth-proxy-config\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933074 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvk2j\" (UniqueName: \"kubernetes.io/projected/99a19d05-9838-4c7d-aa2c-e778a2ef0148-kube-api-access-xvk2j\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933107 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-socket-dir-parent\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933135 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-netns\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933159 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-etc-kubernetes\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933185 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbgrz\" (UniqueName: 
\"kubernetes.io/projected/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-kube-api-access-hbgrz\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933211 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-os-release\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933253 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-k8s-cni-cncf-io\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933279 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-netns\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933281 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a19d05-9838-4c7d-aa2c-e778a2ef0148-proxy-tls\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933321 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-cnibin\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933336 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q747\" (UniqueName: \"kubernetes.io/projected/03a2c34b-edd9-489b-a8e6-23502cdeb309-kube-api-access-5q747\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933351 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp96d\" (UniqueName: \"kubernetes.io/projected/d1deae4d-39c0-4684-8851-d2e6da166a93-kube-api-access-bp96d\") pod \"node-resolver-kf48m\" (UID: \"d1deae4d-39c0-4684-8851-d2e6da166a93\") " pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933365 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-cni-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933380 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-conf-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " 
pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933403 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-os-release\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933602 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933697 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a19d05-9838-4c7d-aa2c-e778a2ef0148-mcd-auth-proxy-config\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933812 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-os-release\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933813 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933859 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-socket-dir-parent\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933977 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-cnibin\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.934016 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-conf-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.934081 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-os-release\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.934102 
4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-etc-kubernetes\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.934128 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03a2c34b-edd9-489b-a8e6-23502cdeb309-cni-binary-copy\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.934151 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-cni-dir\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.933257 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-cni-binary-copy\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.934166 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/03a2c34b-edd9-489b-a8e6-23502cdeb309-host-run-k8s-cni-cncf-io\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.944293 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a19d05-9838-4c7d-aa2c-e778a2ef0148-proxy-tls\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.958664 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbgrz\" (UniqueName: \"kubernetes.io/projected/33ba0f3b-6a75-44b9-b9ca-75c81656eb4b-kube-api-access-hbgrz\") pod \"multus-additional-cni-plugins-d58xp\" (UID: \"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\") " pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.967686 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvk2j\" (UniqueName: \"kubernetes.io/projected/99a19d05-9838-4c7d-aa2c-e778a2ef0148-kube-api-access-xvk2j\") pod \"machine-config-daemon-6w5xr\" (UID: \"99a19d05-9838-4c7d-aa2c-e778a2ef0148\") " pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.977832 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q747\" (UniqueName: \"kubernetes.io/projected/03a2c34b-edd9-489b-a8e6-23502cdeb309-kube-api-access-5q747\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.988939 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:34 crc kubenswrapper[4610]: I1006 08:41:34.989420 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp96d\" (UniqueName: \"kubernetes.io/projected/d1deae4d-39c0-4684-8851-d2e6da166a93-kube-api-access-bp96d\") pod \"node-resolver-kf48m\" (UID: \"d1deae4d-39c0-4684-8851-d2e6da166a93\") " pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.010739 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.027732 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kf48m" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.040247 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: W1006 08:41:35.040416 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1deae4d_39c0_4684_8851_d2e6da166a93.slice/crio-29a2be40fc92d4efb46524c11740a73394643f981b42f7bd5ff8cfd989062ae8 WatchSource:0}: Error finding container 29a2be40fc92d4efb46524c11740a73394643f981b42f7bd5ff8cfd989062ae8: Status 404 returned error can't find the container with id 29a2be40fc92d4efb46524c11740a73394643f981b42f7bd5ff8cfd989062ae8 Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.042729 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d58xp" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.052632 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.071383 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.071505 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.071580 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.071804 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.073928 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.121419 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.148677 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.166930 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.192005 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.259345 4610 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.266796 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.280892 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.280929 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.280938 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: 
I1006 08:41:35.281027 4610 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.288540 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kf48m" event={"ID":"d1deae4d-39c0-4684-8851-d2e6da166a93","Type":"ContainerStarted","Data":"29a2be40fc92d4efb46524c11740a73394643f981b42f7bd5ff8cfd989062ae8"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.296095 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.296333 4610 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.296809 4610 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.296948 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"1e6d64e2cc253de0314e6fbfbd686203e3d01cc911a333b499d0af1b7cc09b71"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.297756 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerStarted","Data":"7593ca9e94d18c2ac730362de918499d3fcffb933b88ea58a17043a4e420cf62"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.297787 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.297904 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.298527 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.298597 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.298653 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.329234 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.386541 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.389031 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.403994 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.405247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.405265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.405274 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.405288 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.405298 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.424068 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.429777 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.429816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.429828 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.429845 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.429858 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.455591 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.459427 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.459464 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.459477 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.459492 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.459505 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.482386 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.487507 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.487543 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.487553 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.487569 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.487581 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.502681 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: E1006 08:41:35.502802 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.504244 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.504263 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.504271 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.504291 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.504302 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.567320 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pqkpj"] Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.568357 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.572155 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.575409 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.575546 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.575801 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.577600 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.577820 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.578826 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.590920 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.603477 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.606215 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.606247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.606259 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.606277 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.606290 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.615481 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.629768 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644467 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-slash\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644519 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-netns\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644544 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-etc-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644628 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-ovn\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644669 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-netd\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644700 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-systemd-units\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644720 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-var-lib-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644743 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fv5s\" (UniqueName: \"kubernetes.io/projected/980266ef-4c63-4532-8b33-25fa1c57a9a7-kube-api-access-8fv5s\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644762 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-log-socket\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644782 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-script-lib\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644817 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-config\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644848 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644870 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-bin\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644891 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovn-node-metrics-cert\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644918 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-node-log\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644939 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644961 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.644983 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-env-overrides\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.645007 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-kubelet\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.645030 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-systemd\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.647440 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.667980 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: 
I1006 08:41:35.686786 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.689971 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.693794 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/03a2c34b-edd9-489b-a8e6-23502cdeb309-multus-daemon-config\") pod \"multus-kdc9x\" (UID: \"03a2c34b-edd9-489b-a8e6-23502cdeb309\") " pod="openshift-multus/multus-kdc9x" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.701726 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.708799 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.708826 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.708835 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.708850 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.708859 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.717646 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.730581 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.741598 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746077 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746113 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-bin\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746134 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovn-node-metrics-cert\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746159 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-node-log\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746175 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746190 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746206 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-env-overrides\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746221 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-kubelet\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746238 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-systemd\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746253 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-slash\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-netns\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746280 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-etc-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746295 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-ovn\") 
pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746319 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-netd\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746340 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-systemd-units\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746355 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-var-lib-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746368 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fv5s\" (UniqueName: \"kubernetes.io/projected/980266ef-4c63-4532-8b33-25fa1c57a9a7-kube-api-access-8fv5s\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746382 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-log-socket\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746396 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-script-lib\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746419 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-config\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.746992 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-config\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747050 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-slash\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747083 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-var-lib-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747134 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747138 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-log-socket\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747146 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-etc-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747159 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-ovn\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747180 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-netns\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747177 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-openvswitch\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747223 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747206 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-bin\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747254 
4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-node-log\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747229 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-kubelet\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747309 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-netd\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747344 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-systemd\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747518 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-systemd-units\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747705 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-env-overrides\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.747832 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-script-lib\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.751320 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovn-node-metrics-cert\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.755309 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.764370 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fv5s\" (UniqueName: \"kubernetes.io/projected/980266ef-4c63-4532-8b33-25fa1c57a9a7-kube-api-access-8fv5s\") pod \"ovnkube-node-pqkpj\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.767592 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.768780 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.780731 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.810693 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.810738 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.810748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.810764 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.810775 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.879671 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:35 crc kubenswrapper[4610]: W1006 08:41:35.893755 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980266ef_4c63_4532_8b33_25fa1c57a9a7.slice/crio-f1c4faa0cea60bcd12158a7e4cbcf44a7f85e5b1df6d5dc30782b215946f9f9b WatchSource:0}: Error finding container f1c4faa0cea60bcd12158a7e4cbcf44a7f85e5b1df6d5dc30782b215946f9f9b: Status 404 returned error can't find the container with id f1c4faa0cea60bcd12158a7e4cbcf44a7f85e5b1df6d5dc30782b215946f9f9b Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.914343 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.914368 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.914376 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.914390 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.914398 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:35Z","lastTransitionTime":"2025-10-06T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:35 crc kubenswrapper[4610]: I1006 08:41:35.959318 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kdc9x" Oct 06 08:41:35 crc kubenswrapper[4610]: W1006 08:41:35.971269 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a2c34b_edd9_489b_a8e6_23502cdeb309.slice/crio-aa12655d959599dffbba7b6e0aabd868919b4dcbea2ec7e9b838498b352cf02d WatchSource:0}: Error finding container aa12655d959599dffbba7b6e0aabd868919b4dcbea2ec7e9b838498b352cf02d: Status 404 returned error can't find the container with id aa12655d959599dffbba7b6e0aabd868919b4dcbea2ec7e9b838498b352cf02d Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.016362 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.016389 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.016399 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.016415 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.016425 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.069558 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.069684 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.118831 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.118873 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.118886 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.118902 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.118913 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.221487 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.221532 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.221543 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.221560 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.221574 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.302081 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" exitCode=0 Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.302227 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.302624 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"f1c4faa0cea60bcd12158a7e4cbcf44a7f85e5b1df6d5dc30782b215946f9f9b"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.308003 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.308063 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.313215 4610 generic.go:334] "Generic (PLEG): container finished" podID="33ba0f3b-6a75-44b9-b9ca-75c81656eb4b" containerID="6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c" exitCode=0 Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.313306 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerDied","Data":"6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.315602 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdc9x" event={"ID":"03a2c34b-edd9-489b-a8e6-23502cdeb309","Type":"ContainerStarted","Data":"35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.315658 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdc9x" event={"ID":"03a2c34b-edd9-489b-a8e6-23502cdeb309","Type":"ContainerStarted","Data":"aa12655d959599dffbba7b6e0aabd868919b4dcbea2ec7e9b838498b352cf02d"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.317815 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kf48m" event={"ID":"d1deae4d-39c0-4684-8851-d2e6da166a93","Type":"ContainerStarted","Data":"c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.320712 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.324363 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.324397 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.324408 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.324425 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.324436 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.343930 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.358935 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.384010 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.401203 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.414634 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.427378 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.427953 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.427974 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.427984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.428001 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.428016 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.441361 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.466449 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.480960 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.494996 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.506286 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.517651 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.530180 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.530803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.530832 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.530843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.530860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.530873 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.541791 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.557680 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.576483 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.599586 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003
d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.613629 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.624636 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.633019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.633080 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.633095 4610 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.633112 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.633459 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.635300 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.647936 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.656423 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.656852 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:41:44.656829094 +0000 UTC m=+36.371882492 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.658825 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.670906 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.687414 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.701822 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c
915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.724155 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.735273 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.735301 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.735309 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.735324 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.735332 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.739225 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.757929 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.758000 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.758031 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.758076 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758207 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758225 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758237 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758288 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:44.758268848 +0000 UTC m=+36.473322236 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758637 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758659 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758668 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758708 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:41:44.758698459 +0000 UTC m=+36.473751847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758746 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758771 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:44.758764051 +0000 UTC m=+36.473817439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758819 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: E1006 08:41:36.758850 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:41:44.758838873 +0000 UTC m=+36.473892261 (durationBeforeRetry 8s). 
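[Annotation] The retry arithmetic in the MountVolume operations above is worth noting: each failed SetUp is requeued with "No retries permitted until" a point ~8s in the future (durationBeforeRetry 8s), and kubelet's nested pending operations grow that delay exponentially on repeated failures up to a cap. A minimal sketch of that doubling-with-cap policy, in Python; the initial/max constants here are assumptions for illustration, not values read from this log:

```python
from datetime import datetime, timedelta

# Assumed constants for illustration; kubelet's volume-operation
# backoff uses a similar small initial delay and a hard cap.
INITIAL_BACKOFF = timedelta(milliseconds=500)
MAX_BACKOFF = timedelta(minutes=2, seconds=12)
BACKOFF_FACTOR = 2

def next_retry(last_backoff: timedelta, now: datetime) -> tuple[timedelta, datetime]:
    """Double the previous delay, clamp it at the cap, and return the
    earliest time at which the mount may be retried."""
    delay = min(last_backoff * BACKOFF_FACTOR, MAX_BACKOFF)
    return delay, now + delay

# Example: after several failures the delay has grown to 8s, matching
# the "No retries permitted until ... (durationBeforeRetry 8s)" records.
now = datetime.fromisoformat("2025-10-06T08:41:36")
delay, retry_at = next_retry(timedelta(seconds=4), now)
print(delay, retry_at)  # 0:00:08 2025-10-06 08:41:44
```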
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.836913 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.837409 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.837420 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.837440 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.837453 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.940142 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.940423 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.940507 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.940662 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:36 crc kubenswrapper[4610]: I1006 08:41:36.940746 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:36Z","lastTransitionTime":"2025-10-06T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.044705 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.044757 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.044775 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.044795 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.044811 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.069509 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.069582 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:37 crc kubenswrapper[4610]: E1006 08:41:37.069634 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:37 crc kubenswrapper[4610]: E1006 08:41:37.069758 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
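[Annotation] The status-patch failures repeating through this log share one root cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving an expired certificate (notAfter 2025-08-24T17:21:41Z, roughly six weeks before the logged current time 2025-10-06T08:41:36Z), so every kubelet status patch is rejected. A small stdlib-only sketch for extracting that skew from journal lines like these; the regex targets the exact "current time ... is after ..." phrasing of the x509 error and is an illustrative helper, not part of any tool in this log:

```python
import re
from datetime import datetime

# Matches the x509 error emitted above, e.g.
# "... current time 2025-10-06T08:41:36Z is after 2025-08-24T17:21:41Z"
PATTERN = re.compile(r"current time (\S+) is after (\S+)")

def cert_overrun(line: str):
    """Return (now, not_after, overrun) if the line carries the
    expired-certificate error, else None."""
    m = PATTERN.search(line)
    if not m:
        return None
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    now = datetime.strptime(m.group(1), fmt)
    # Strip a trailing quote in case the timestamp ends the quoted error.
    not_after = datetime.strptime(m.group(2).rstrip('"'), fmt)
    return now, not_after, now - not_after

line = ('tls: failed to verify certificate: x509: certificate has expired '
        'or is not yet valid: current time 2025-10-06T08:41:36Z '
        'is after 2025-08-24T17:21:41Z')
print(cert_overrun(line)[2])  # 42 days, 15:19:55
```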
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.146765 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.146800 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.146808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.146825 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.146840 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.249998 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.250058 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.250067 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.250081 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.250089 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.322608 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerStarted","Data":"4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.325654 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.325690 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.325704 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.325746 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.340971 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.352909 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.353121 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.353149 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.353163 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.353171 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.356641 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.371136 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.389310 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.401317 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.420144 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.441810 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.455262 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.455309 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.455321 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.455346 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.455358 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.458023 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.474014 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.478770 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-v8tw6"] Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.479209 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.481602 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.483285 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.483746 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.484057 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.495600 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.509649 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.524382 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.537776 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.548632 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.557926 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.557973 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.557984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.558002 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.558013 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.562504 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.566923 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-host\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.566965 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-serviceca\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.567011 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8vw\" (UniqueName: \"kubernetes.io/projected/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-kube-api-access-7m8vw\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.576024 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.588801 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.598567 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.618391 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.633673 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.646877 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.660101 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.660175 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.660194 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.660105 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.660261 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.660280 4610 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.668088 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-host\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.668134 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-serviceca\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.668193 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8vw\" (UniqueName: \"kubernetes.io/projected/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-kube-api-access-7m8vw\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.668223 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-host\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.669600 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-serviceca\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.676701 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.689970 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8vw\" (UniqueName: \"kubernetes.io/projected/2d96b4ed-90ec-4ea5-b244-55c3b8f55def-kube-api-access-7m8vw\") pod \"node-ca-v8tw6\" (UID: \"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\") " pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.691898 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.704666 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.719399 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.731265 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.753630 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.763768 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.763794 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.763802 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.763816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.763824 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.767613 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.815762 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-v8tw6" Oct 06 08:41:37 crc kubenswrapper[4610]: W1006 08:41:37.829502 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d96b4ed_90ec_4ea5_b244_55c3b8f55def.slice/crio-b60eee856b59e8de84b54ff87c754bd3dab917b8244d6792f93d82280abc6f9d WatchSource:0}: Error finding container b60eee856b59e8de84b54ff87c754bd3dab917b8244d6792f93d82280abc6f9d: Status 404 returned error can't find the container with id b60eee856b59e8de84b54ff87c754bd3dab917b8244d6792f93d82280abc6f9d Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.866257 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.866363 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.866377 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.866395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.866672 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.969227 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.969267 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.969279 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.969297 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:37 crc kubenswrapper[4610]: I1006 08:41:37.969310 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:37Z","lastTransitionTime":"2025-10-06T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.069869 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:38 crc kubenswrapper[4610]: E1006 08:41:38.070491 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.071538 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.071572 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.071581 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.071596 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.071605 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.173403 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.173438 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.173448 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.173462 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.173474 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.276398 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.276436 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.276448 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.276466 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.276480 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.331660 4610 generic.go:334] "Generic (PLEG): container finished" podID="33ba0f3b-6a75-44b9-b9ca-75c81656eb4b" containerID="4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5" exitCode=0 Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.331804 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerDied","Data":"4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.334643 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v8tw6" event={"ID":"2d96b4ed-90ec-4ea5-b244-55c3b8f55def","Type":"ContainerStarted","Data":"c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.334693 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v8tw6" event={"ID":"2d96b4ed-90ec-4ea5-b244-55c3b8f55def","Type":"ContainerStarted","Data":"b60eee856b59e8de84b54ff87c754bd3dab917b8244d6792f93d82280abc6f9d"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.340287 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.340331 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.357061 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.378463 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.379950 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.379986 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.379995 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.380012 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.380023 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.393087 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.411086 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.427402 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.446811 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.469254 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.480760 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.482121 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.482144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.482154 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.482171 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.482181 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.493884 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.507750 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.517174 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.529631 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.539921 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.561409 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.573892 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.584956 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.585008 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.585018 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.585034 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.585077 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.587861 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.597097 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.617721 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.632273 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.643311 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.651588 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.663081 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.676607 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.687035 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.687084 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.687095 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.687117 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.687127 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.690382 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.705036 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.718550 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.735152 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.749003 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.764037 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.777246 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.788704 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.788733 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.788743 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.788757 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.788768 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.891101 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.891142 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.891151 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.891166 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.891176 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.993259 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.993303 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.993316 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.993332 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:38 crc kubenswrapper[4610]: I1006 08:41:38.993343 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:38Z","lastTransitionTime":"2025-10-06T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.069587 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.069625 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:39 crc kubenswrapper[4610]: E1006 08:41:39.069729 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:39 crc kubenswrapper[4610]: E1006 08:41:39.070017 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.089889 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.095334 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.095374 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.095386 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.095403 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.095412 4610 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.102751 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.112328 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 
2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.128735 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.138888 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.154464 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.163079 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.173792 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.189590 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.198694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.198732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.198743 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.198762 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.198774 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.203503 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.222164 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.232572 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.247854 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.259852 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.269737 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.301439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.301478 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.301488 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.301503 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.301513 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.350446 4610 generic.go:334] "Generic (PLEG): container finished" podID="33ba0f3b-6a75-44b9-b9ca-75c81656eb4b" containerID="88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b" exitCode=0 Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.350488 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerDied","Data":"88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.361873 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-conf
ig\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.372949 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.394163 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.411646 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.426268 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.426307 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.426316 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.426331 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.426342 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.434857 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.463796 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.499100 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.512170 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.525118 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 
08:41:39.528686 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.528708 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.528719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.528737 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.528748 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.544172 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.562724 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.579310 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.595412 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.609242 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.626022 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.633398 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.633596 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.633660 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.633736 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.633790 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.736507 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.736549 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.736561 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.736579 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.736591 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.839259 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.839306 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.839318 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.839339 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.839352 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.942253 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.942580 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.942661 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.942750 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:39 crc kubenswrapper[4610]: I1006 08:41:39.942835 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:39Z","lastTransitionTime":"2025-10-06T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.047513 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.048080 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.048096 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.048124 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.048148 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.070166 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:40 crc kubenswrapper[4610]: E1006 08:41:40.070429 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.151004 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.151065 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.151075 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.151093 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.151103 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.254195 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.254272 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.254302 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.254319 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.254330 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.364657 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.364935 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.365020 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.365175 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.365262 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.368436 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.371484 4610 generic.go:334] "Generic (PLEG): container finished" podID="33ba0f3b-6a75-44b9-b9ca-75c81656eb4b" containerID="cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54" exitCode=0 Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.371522 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerDied","Data":"cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.389591 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.402711 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.415407 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.430296 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.444008 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.463490 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.468741 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.468767 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.468778 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.468797 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.468814 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.484030 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.501372 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.514755 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.529428 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.539793 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.552298 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.564700 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.575065 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.575631 4610 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.575646 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.575664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.575676 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.577018 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-
config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.589427 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.679309 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.679351 4610 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.679362 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.679380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.679393 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.781769 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.781821 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.781829 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.781842 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.781851 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.884603 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.884663 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.884686 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.884714 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.884735 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.987719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.987781 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.987793 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.987812 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:40 crc kubenswrapper[4610]: I1006 08:41:40.987821 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:40Z","lastTransitionTime":"2025-10-06T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.069854 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:41:41 crc kubenswrapper[4610]: E1006 08:41:41.069995 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.070287 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:41:41 crc kubenswrapper[4610]: E1006 08:41:41.070383 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.092634 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.092686 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.092701 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.092722 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.092736 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.195562 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.196095 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.196107 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.196126 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.196135 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.298985 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.299034 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.299061 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.299078 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.299090 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.377230 4610 generic.go:334] "Generic (PLEG): container finished" podID="33ba0f3b-6a75-44b9-b9ca-75c81656eb4b" containerID="4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8" exitCode=0
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.377288 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerDied","Data":"4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8"}
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.402181 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.402224 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.402235 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.402251 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.402262 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.412938 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbg
rz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.426316 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.448014 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.467717 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.467717 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.483582 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.500235 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.504508 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.504553 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.504565 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.504584 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.504598 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.514657 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.525878 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z"
Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.558760 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.572759 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.586799 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.596876 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.606899 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.606924 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.606931 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.606945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.606954 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.610950 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.623761 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.709585 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.709707 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.709803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.709898 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.710011 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.812514 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.812566 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.812580 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.812598 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.812618 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.915578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.915609 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.915620 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.915636 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:41 crc kubenswrapper[4610]: I1006 08:41:41.915647 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:41Z","lastTransitionTime":"2025-10-06T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.017676 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.017714 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.017724 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.017738 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.017748 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.069589 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:42 crc kubenswrapper[4610]: E1006 08:41:42.069698 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.121171 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.121201 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.121211 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.121227 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.121237 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.224248 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.224288 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.224297 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.224315 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.224325 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.327026 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.327120 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.327161 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.327182 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.327195 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.389001 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.389424 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.393632 4610 generic.go:334] "Generic (PLEG): container finished" podID="33ba0f3b-6a75-44b9-b9ca-75c81656eb4b" containerID="972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45" exitCode=0 Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.393694 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerDied","Data":"972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.404830 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.431458 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.431803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.431830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.431843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.431864 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.431875 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.445867 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.473181 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac7
5b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.488598 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:42 crc kubenswrapper[4610]: 
I1006 08:41:42.488984 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.502506 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.514474 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.527454 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.535095 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.535126 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.535141 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.535160 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.535172 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.538207 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.557528 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.571960 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.586752 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.597316 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.611140 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.624782 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.638863 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.638925 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.638942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.640667 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.640692 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.640805 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.654674 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.667376 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.682006 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
0-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.692643 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.708678 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.725816 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.736766 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.746299 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.752318 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.752489 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.752565 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.752680 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.752762 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.758247 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.766850 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.779402 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.792163 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.803509 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.812980 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.854897 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.855225 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.855350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.855576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.855791 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.958102 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.958340 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.958397 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.958455 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:42 crc kubenswrapper[4610]: I1006 08:41:42.958509 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:42Z","lastTransitionTime":"2025-10-06T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.064621 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.064660 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.064669 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.064684 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.064694 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.069429 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:43 crc kubenswrapper[4610]: E1006 08:41:43.069535 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.069436 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:43 crc kubenswrapper[4610]: E1006 08:41:43.069673 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.167525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.167858 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.167866 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.167879 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.167889 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.271019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.271092 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.271108 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.271130 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.271173 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.374581 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.374615 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.374625 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.374638 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.374647 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.402820 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.402804 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" event={"ID":"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b","Type":"ContainerStarted","Data":"64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.403014 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.420205 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.432437 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.439920 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io
\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.457947 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.477495 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.477542 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.477557 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.477578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.477594 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.481841 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.493495 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.507121 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.517992 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.535082 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.554119 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.568199 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.579794 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.579839 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.579851 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.579869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.579880 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.586088 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.601530 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 
08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.620461 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b6
9481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\
",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.632710 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.643128 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.653430 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.661891 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.681621 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.681657 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.681666 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.681682 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.681693 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.682512 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.693678 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.707236 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.721123 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.737900 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.756538 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.769236 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.781879 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.784035 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.784090 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc 
kubenswrapper[4610]: I1006 08:41:43.784107 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.784124 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.784133 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.802496 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.821497 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d
435043eab3150a736d2abedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.838016 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.854343 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.868860 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.886657 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.886729 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.886747 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.886773 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.886791 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.989576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.989614 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.989626 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.989643 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:43 crc kubenswrapper[4610]: I1006 08:41:43.989654 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:43Z","lastTransitionTime":"2025-10-06T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.069783 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.069905 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.091930 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.091978 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.091989 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.092003 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.092013 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.193950 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.193986 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.193998 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.194014 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.194022 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.296432 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.296495 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.296508 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.296525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.296536 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.399699 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.399748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.399763 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.399783 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.399799 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.405569 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.502329 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.502363 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.502372 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.502391 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.502400 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.605691 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.605749 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.605766 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.605791 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.605808 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.708276 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.708332 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.708345 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.708364 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.708381 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.745679 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.745911 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:00.745891236 +0000 UTC m=+52.460944634 (durationBeforeRetry 16s). 
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.745911 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:00.745891236 +0000 UTC m=+52.460944634 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.812232 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.812295 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.812308 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.812331 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.812345 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.847293 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.847389 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.847428 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.847485 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847550 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847595 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847612 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847653 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847679 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847684 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:00.847656688 +0000 UTC m=+52.562710096 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847700 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847770 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:00.847746251 +0000 UTC m=+52.562799789 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847859 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847897 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:00.847884684 +0000 UTC m=+52.562938262 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.847966 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: E1006 08:41:44.848005 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:00.847992837 +0000 UTC m=+52.563046435 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.915959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.916071 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.916099 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.916126 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:44 crc kubenswrapper[4610]: I1006 08:41:44.916146 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:44Z","lastTransitionTime":"2025-10-06T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.020209 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.020312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.020324 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.020339 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.020348 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.069986 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.070020 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.070163 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.070335 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.122976 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.123027 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.123037 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.123085 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.123098 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.225515 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.225617 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.225643 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.226027 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.226242 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.328662 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.328696 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.328703 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.328718 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.328727 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.407844 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.431686 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.431738 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.431750 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.431769 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.431786 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.541301 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.541342 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.541351 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.541364 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.541373 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.644432 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.644493 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.644507 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.644533 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.644547 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.699020 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.699119 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.699137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.699165 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.699184 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.712034 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.716733 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.716775 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.716788 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.716808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.716825 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.752054 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.762424 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.762461 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.762472 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.762489 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.762499 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.785805 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.791122 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.791158 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.791172 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.791193 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.791208 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.809644 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.815349 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.815378 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.815387 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.815404 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.815417 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.837542 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:45 crc kubenswrapper[4610]: E1006 08:41:45.837719 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.839127 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.839152 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.839163 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.839182 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.839193 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.941321 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.941366 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.941377 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.941395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:45 crc kubenswrapper[4610]: I1006 08:41:45.941408 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:45Z","lastTransitionTime":"2025-10-06T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.044265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.044338 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.044357 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.044386 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.044406 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.069463 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:46 crc kubenswrapper[4610]: E1006 08:41:46.069681 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.147303 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.147343 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.147353 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.147367 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.147378 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.250235 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.250281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.250294 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.250313 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.250324 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.354139 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.354193 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.354206 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.354235 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.354249 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.413272 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/0.log" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.416245 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf" exitCode=1 Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.416311 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.417218 4610 scope.go:117] "RemoveContainer" containerID="9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.440523 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.456216 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.456454 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.456487 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.456499 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.456517 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.456533 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.468011 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.483553 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.499565 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.514963 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.528269 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.546351 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 
2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.559177 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.559299 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.559360 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.559450 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.559510 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.566439 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.586597 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d
435043eab3150a736d2abedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.608658 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.625305 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.640536 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.655553 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.662854 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.662889 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.662900 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.662919 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.662930 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.673791 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.765868 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.765917 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.765933 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.765953 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.765963 4610 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.873776 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.873817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.873825 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.873843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.873854 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.976285 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.976343 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.976362 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.976386 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:46 crc kubenswrapper[4610]: I1006 08:41:46.976398 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:46Z","lastTransitionTime":"2025-10-06T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.069680 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.069726 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:47 crc kubenswrapper[4610]: E1006 08:41:47.069831 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:47 crc kubenswrapper[4610]: E1006 08:41:47.069958 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.078186 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.078222 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.078231 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.078244 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.078253 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.180908 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.180955 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.180968 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.180988 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.181003 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.283135 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.283171 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.283179 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.283195 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.283205 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.407496 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.407541 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.407555 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.407581 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.407605 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.422182 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/0.log" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.425216 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.425360 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.444772 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.467690 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.480888 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.494026 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 
2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.505267 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.509348 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.509380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.509413 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.509431 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.509442 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.524388 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce
58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.539855 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh"] Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.540448 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.542761 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.543365 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.546296 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.561682 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.575581 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.589791 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.600097 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.611645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.611681 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.611690 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.611705 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.611720 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.611691 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.624015 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.640229 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.651036 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.662703 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.680332 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.684372 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nxj5\" (UniqueName: \"kubernetes.io/projected/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-kube-api-access-9nxj5\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.684428 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.684455 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.684508 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.695351 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.715279 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.715476 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.715513 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.715524 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.715541 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.715552 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.734608 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.746117 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.759062 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.771968 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.783182 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.785116 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.785150 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.785195 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.785243 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nxj5\" (UniqueName: \"kubernetes.io/projected/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-kube-api-access-9nxj5\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.785871 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.786092 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.793074 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.799861 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.806345 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nxj5\" (UniqueName: \"kubernetes.io/projected/3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1-kube-api-access-9nxj5\") pod \"ovnkube-control-plane-749d76644c-mpbkh\" (UID: \"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.812676 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.818767 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.818942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.819008 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.819108 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.819197 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.830911 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.842226 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.852830 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.855197 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: W1006 08:41:47.868284 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb2ae6e_2e9c_49b3_a7b3_2d9037d563d1.slice/crio-936d333b853d56d1fb0ea5735cbe6767a82093c65c00f56c8bf9ddb3df16f092 WatchSource:0}: Error finding container 936d333b853d56d1fb0ea5735cbe6767a82093c65c00f56c8bf9ddb3df16f092: Status 404 returned error can't find the container with id 936d333b853d56d1fb0ea5735cbe6767a82093c65c00f56c8bf9ddb3df16f092 Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.870645 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.885828 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.921954 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.922008 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.922019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:47 crc 
kubenswrapper[4610]: I1006 08:41:47.922062 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:47 crc kubenswrapper[4610]: I1006 08:41:47.922102 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:47Z","lastTransitionTime":"2025-10-06T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.024243 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.024307 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.024320 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.024340 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.024353 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.070193 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:48 crc kubenswrapper[4610]: E1006 08:41:48.070379 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.071168 4610 scope.go:117] "RemoveContainer" containerID="90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.126748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.126808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.126817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.126830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.126839 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.236077 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.236111 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.236121 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.236137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.236147 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.338877 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.338912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.338923 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.338941 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.338953 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.429304 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.430845 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.431276 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.432560 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/1.log" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.433002 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/0.log" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.435344 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814" exitCode=1 Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.435413 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.435465 4610 scope.go:117] "RemoveContainer" containerID="9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.436235 4610 scope.go:117] "RemoveContainer" containerID="e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814" Oct 06 08:41:48 crc kubenswrapper[4610]: E1006 08:41:48.436433 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.437271 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" event={"ID":"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1","Type":"ContainerStarted","Data":"4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.437380 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" event={"ID":"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1","Type":"ContainerStarted","Data":"675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.437440 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" event={"ID":"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1","Type":"ContainerStarted","Data":"936d333b853d56d1fb0ea5735cbe6767a82093c65c00f56c8bf9ddb3df16f092"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.441098 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.441129 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.441141 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.441156 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.441168 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.450627 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.465255 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.479497 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.494663 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.507643 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.521670 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.537018 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.542987 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.543115 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.543191 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.543280 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.543347 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.560842 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce
58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.580841 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.602411 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.615132 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.629825 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.645871 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.645900 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.645908 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.645920 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.645930 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.656565 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.668117 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.687253 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.702855 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.717583 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.730666 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.742410 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.748313 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.748392 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.748417 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc 
kubenswrapper[4610]: I1006 08:41:48.748458 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.748478 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.766611 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126
669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.782876 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.799350 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.811968 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.824183 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.837208 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.849152 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.851662 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.851711 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.851725 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.851744 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.851755 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.865431 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.877367 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 
08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.898858 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06
d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.911722 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.924837 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.938330 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.954434 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.954488 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.954501 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.954522 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:48 crc kubenswrapper[4610]: I1006 08:41:48.954537 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:48Z","lastTransitionTime":"2025-10-06T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.014453 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-46wzl"] Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.014999 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.015114 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.031327 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.045013 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.056861 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.056906 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.056918 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.056935 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.056948 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.060321 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.070026 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.070026 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.070174 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.070233 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.080948 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.092751 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.103903 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.103936 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl7kk\" (UniqueName: \"kubernetes.io/projected/a62060d4-5efa-4c4f-851d-8738476f690e-kube-api-access-wl7kk\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.108578 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.118674 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.138229 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 
0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.150162 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 
08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.158666 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.158696 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.158717 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.158730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.158740 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.162307 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.175907 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.188563 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.201167 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.205167 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.205200 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl7kk\" (UniqueName: \"kubernetes.io/projected/a62060d4-5efa-4c4f-851d-8738476f690e-kube-api-access-wl7kk\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.205490 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.205558 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:41:49.705538722 +0000 UTC m=+41.420592110 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.216826 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.226916 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl7kk\" (UniqueName: \"kubernetes.io/projected/a62060d4-5efa-4c4f-851d-8738476f690e-kube-api-access-wl7kk\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.248367 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.260534 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.260559 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.260569 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.260584 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.260595 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.269377 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dc
ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.282006 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.293527 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb2620107614
3592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.302614 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.317103 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.328023 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.338402 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.351384 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.364403 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.366365 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.366641 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.366652 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.366669 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.366679 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.380458 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce
58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9705f2a3f79fc030b2afb41ed544c2eea8804d6d435043eab3150a736d2abedf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:45Z\\\",\\\"message\\\":\\\"41:45.842891 5823 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 08:41:45.843029 5823 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:41:45.843110 5823 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:41:45.843140 5823 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:41:45.843174 5823 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 08:41:45.843199 5823 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 08:41:45.843219 5823 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:41:45.843214 5823 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:41:45.843238 5823 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:41:45.843246 5823 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:41:45.843284 5823 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 08:41:45.843316 5823 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:41:45.843354 5823 factory.go:656] Stopping watch factory\\\\nI1006 08:41:45.843388 5823 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} 
name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.397006 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.408688 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.418585 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.428848 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.437020 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.441926 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/1.log" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.452815 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.463573 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.468314 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.468344 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.468354 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.468370 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.468380 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.476628 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.488068 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.571254 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.571288 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.571298 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.571313 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.571323 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.673649 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.673685 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.673695 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.673711 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.673722 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.711937 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.712197 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:49 crc kubenswrapper[4610]: E1006 08:41:49.712271 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:41:50.712245807 +0000 UTC m=+42.427299205 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.775619 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.775643 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.775650 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.775664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.775674 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.879188 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.879241 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.879255 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.879276 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.879291 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.982668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.982706 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.982715 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.982732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:49 crc kubenswrapper[4610]: I1006 08:41:49.982746 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:49Z","lastTransitionTime":"2025-10-06T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.069595 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:50 crc kubenswrapper[4610]: E1006 08:41:50.069790 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.086177 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.086252 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.086271 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.086297 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.086318 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.189304 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.189380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.189394 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.189419 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.189435 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.291974 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.292019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.292030 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.292068 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.292080 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.394886 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.394945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.394956 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.394977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.394990 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.498168 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.498204 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.498216 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.498232 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.498243 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.601768 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.601823 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.601843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.601867 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.601887 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.704738 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.704780 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.704788 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.704805 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.704814 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.723451 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:50 crc kubenswrapper[4610]: E1006 08:41:50.723620 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:50 crc kubenswrapper[4610]: E1006 08:41:50.723728 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:41:52.723706758 +0000 UTC m=+44.438760146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.806879 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.806959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.806984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.807010 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.807029 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.909811 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.909845 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.909854 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.909873 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:50 crc kubenswrapper[4610]: I1006 08:41:50.909885 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:50Z","lastTransitionTime":"2025-10-06T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.012218 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.012243 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.012250 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.012265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.012273 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.069624 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.069699 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.069713 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:51 crc kubenswrapper[4610]: E1006 08:41:51.069871 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:51 crc kubenswrapper[4610]: E1006 08:41:51.070208 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:51 crc kubenswrapper[4610]: E1006 08:41:51.070291 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.115540 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.115603 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.115621 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.115644 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.115662 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.218440 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.218850 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.219013 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.219222 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.219409 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.322359 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.322731 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.322984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.323162 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.323309 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.425928 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.425978 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.425987 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.426003 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.426013 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.529072 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.529142 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.529168 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.529182 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.529191 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.632308 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.632391 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.632416 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.632456 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.632481 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.735081 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.735131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.735144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.735164 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.735178 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.843558 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.843656 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.843674 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.843700 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.843719 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.948279 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.948349 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.948367 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.948395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:51 crc kubenswrapper[4610]: I1006 08:41:51.948413 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:51Z","lastTransitionTime":"2025-10-06T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.051216 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.051270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.051287 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.051312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.051329 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.069524 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:41:52 crc kubenswrapper[4610]: E1006 08:41:52.069735 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.154096 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.154135 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.154146 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.154167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.154178 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.257555 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.257610 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.257620 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.257634 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.257644 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.359538 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.359576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.359588 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.359605 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.359619 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.462145 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.462213 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.462229 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.462251 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.462268 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.565881 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.565937 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.565951 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.565970 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.565982 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.669843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.669965 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.669984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.670010 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.670028 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.746306 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:41:52 crc kubenswrapper[4610]: E1006 08:41:52.746461 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 08:41:52 crc kubenswrapper[4610]: E1006 08:41:52.746526 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:41:56.746507294 +0000 UTC m=+48.461560682 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.773337 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.773388 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.773405 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.773427 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.773445 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.877484 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.877537 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.877552 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.877570 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.877586 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.979771 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.979813 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.979830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.979848 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:52 crc kubenswrapper[4610]: I1006 08:41:52.979861 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:52Z","lastTransitionTime":"2025-10-06T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.070528 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.070649 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.070560 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:41:53 crc kubenswrapper[4610]: E1006 08:41:53.070789 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:41:53 crc kubenswrapper[4610]: E1006 08:41:53.070892 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:41:53 crc kubenswrapper[4610]: E1006 08:41:53.070989 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.083312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.083361 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.083371 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.083400 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.083414 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.187416 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.187498 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.187523 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.187548 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.187570 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.290465 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.290556 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.290586 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.290615 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.290637 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.394459 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.394516 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.394528 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.394547 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.394562 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.498822 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.498893 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.498908 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.498934 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.498950 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.603162 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.603223 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.603235 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.603270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.603286 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.705300 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.705350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.705364 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.705385 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.705400 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.808000 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.808075 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.808084 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.808099 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.808110 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.910390 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.910453 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.910474 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.910515 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:53 crc kubenswrapper[4610]: I1006 08:41:53.910554 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:53Z","lastTransitionTime":"2025-10-06T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.013112 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.013151 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.013162 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.013184 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.013197 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.070229 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:41:54 crc kubenswrapper[4610]: E1006 08:41:54.070378 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.115922 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.115963 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.115973 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.115989 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.116003 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.229867 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.229919 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.229930 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.229947 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.229967 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.332745 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.332803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.332820 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.332842 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.332856 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.435676 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.435715 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.435732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.435746 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.435755 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.562807 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.562879 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.562897 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.563280 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.563332 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.667439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.667478 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.667488 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.667506 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.667516 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.769417 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.769474 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.769485 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.769503 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.769514 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.872662 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.873000 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.873115 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.873210 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.873301 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.976097 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.976135 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.976146 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.976163 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:54 crc kubenswrapper[4610]: I1006 08:41:54.976174 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:54Z","lastTransitionTime":"2025-10-06T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.069963 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:41:55 crc kubenswrapper[4610]: E1006 08:41:55.070180 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.069987 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:41:55 crc kubenswrapper[4610]: E1006 08:41:55.070354 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.069988 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:41:55 crc kubenswrapper[4610]: E1006 08:41:55.070440 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.078312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.078582 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.078648 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.078731 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.078814 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.181261 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.181299 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.181307 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.181323 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.181333 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.284167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.284206 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.284217 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.284232 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.284243 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.387019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.387091 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.387103 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.387122 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.387133 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.489987 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.490031 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.490065 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.490082 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.490093 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.593213 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.593262 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.593274 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.593293 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.593305 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.696159 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.696215 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.696231 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.696254 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.696270 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.798713 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.798754 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.798767 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.798817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.798829 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.901663 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.901730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.901746 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.901768 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:55 crc kubenswrapper[4610]: I1006 08:41:55.901783 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:55Z","lastTransitionTime":"2025-10-06T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.004705 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.004772 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.004789 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.004814 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.004833 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.070373 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.070588 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.106836 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.106882 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.106894 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.106912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.106925 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.151947 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.152014 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.152024 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.152040 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.152066 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.168889 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.173606 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.173657 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.173668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.173687 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.173701 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.187593 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.192335 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.192379 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.192397 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.192416 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.192429 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.210451 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.214218 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.214281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.214292 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.214311 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.214323 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.226861 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.230847 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.230885 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.230896 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.230912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.230924 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.244671 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.244797 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.246925 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.246972 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.246983 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.246999 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.247012 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.349396 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.349439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.349451 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.349468 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.349480 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.452311 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.452374 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.452385 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.452405 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.452418 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.559395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.559468 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.559481 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.559501 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.559517 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.662780 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.662856 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.662874 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.662893 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.662907 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.765871 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.765908 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.765917 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.765929 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.765948 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.787917 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.788132 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.788204 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:42:04.788181964 +0000 UTC m=+56.503235362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.868454 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.868490 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.868500 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.868516 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.868527 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.906328 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.908132 4610 scope.go:117] "RemoveContainer" containerID="e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814" Oct 06 08:41:56 crc kubenswrapper[4610]: E1006 08:41:56.908275 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.929096 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.947728 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.967287 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.971165 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.971256 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.971275 4610 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.971300 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.971319 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:56Z","lastTransitionTime":"2025-10-06T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:56 crc kubenswrapper[4610]: I1006 08:41:56.982648 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:56Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.006661 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.021485 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.032136 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.046879 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.070568 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.070630 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.070674 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:57 crc kubenswrapper[4610]: E1006 08:41:57.070848 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:57 crc kubenswrapper[4610]: E1006 08:41:57.070953 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:41:57 crc kubenswrapper[4610]: E1006 08:41:57.071076 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.072110 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.073283 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.073324 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.073335 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.073350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.073360 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.086670 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.105546 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.116538 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.134778 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.147702 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.160989 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.175753 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.175780 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.175789 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.175802 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.175811 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.176008 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.187782 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:57Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.279500 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.279572 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.279590 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.279616 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.279635 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.383109 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.383459 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.383624 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.383790 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.383892 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.487176 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.487239 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.487256 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.487281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.487298 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.590697 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.590780 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.590800 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.590828 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.590847 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.694111 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.694211 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.694229 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.694285 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.694305 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.797456 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.797532 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.797555 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.797583 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.797603 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.900772 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.900823 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.900837 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.900856 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:57 crc kubenswrapper[4610]: I1006 08:41:57.900869 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:57Z","lastTransitionTime":"2025-10-06T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.003869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.003939 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.003955 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.003981 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.003999 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.069881 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:41:58 crc kubenswrapper[4610]: E1006 08:41:58.070115 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.111117 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.111197 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.111208 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.111226 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.111238 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.214487 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.214541 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.214553 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.214571 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.214582 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.317443 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.317756 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.317860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.318007 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.318146 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.420592 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.420642 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.420654 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.420671 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.420684 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.523803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.523852 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.523862 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.523882 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.523899 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.627000 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.627063 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.627075 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.627093 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.627104 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.729816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.729869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.729884 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.729903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.729914 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.833240 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.834514 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.834560 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.834609 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.834636 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.937296 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.937347 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.937355 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.937368 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:58 crc kubenswrapper[4610]: I1006 08:41:58.937378 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:58Z","lastTransitionTime":"2025-10-06T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.040222 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.040263 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.040276 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.040294 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.040306 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.069434 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.069484 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:41:59 crc kubenswrapper[4610]: E1006 08:41:59.069729 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.069797 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:41:59 crc kubenswrapper[4610]: E1006 08:41:59.069941 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:41:59 crc kubenswrapper[4610]: E1006 08:41:59.070108 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.082020 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.102644 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.115515 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.126788 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.143606 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.143645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.143654 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.143668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.143713 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.148021 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55
e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.163219 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.189320 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce
58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.206484 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.228140 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.242573 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.248999 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.249028 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.249036 4610 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.249077 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.249087 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.258324 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.272976 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.286494 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.298855 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.312299 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.326529 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.339346 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.350643 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.350675 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.350686 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.350700 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.350710 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.453450 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.453481 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.453496 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.453515 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.453527 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.557297 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.557360 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.557378 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.557408 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.557426 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.661467 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.661559 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.661575 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.661646 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.661677 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.765107 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.765170 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.765191 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.765216 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.765236 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.868825 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.868891 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.868914 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.868946 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.868967 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.971634 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.971698 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.971719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.971748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.971768 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:41:59Z","lastTransitionTime":"2025-10-06T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:41:59 crc kubenswrapper[4610]: I1006 08:41:59.998744 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.011163 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.012362 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.037589 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.054007 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.067827 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.070093 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.070293 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.075026 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.075220 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.075285 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.075351 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.075429 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.083760 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.098337 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.113995 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.128858 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.140449 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.153236 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.176508 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.177838 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.177968 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.178636 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.178753 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.178875 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.187755 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.203681 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc 
kubenswrapper[4610]: I1006 08:42:00.217788 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.235856 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.255015 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.273035 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.281779 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.281853 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc 
kubenswrapper[4610]: I1006 08:42:00.281869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.281887 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.281900 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.384094 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.384138 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.384151 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.384170 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.384182 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.486504 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.486552 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.486569 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.486589 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.486598 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.589435 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.589483 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.589497 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.589515 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.589527 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.692859 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.692888 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.692896 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.692908 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.692916 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.795395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.795726 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.795856 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.795985 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.796270 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.829386 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.829675 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:32.82964386 +0000 UTC m=+84.544697288 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.898803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.898875 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.898897 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.898925 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.898950 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:00Z","lastTransitionTime":"2025-10-06T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.930622 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.930700 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.930743 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:00 crc kubenswrapper[4610]: I1006 08:42:00.930798 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.930855 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.930901 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.930944 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:32.930922949 +0000 UTC m=+84.645976337 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.930969 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:32.9309594 +0000 UTC m=+84.646013078 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931083 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931105 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931123 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931172 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:32.931152485 +0000 UTC m=+84.646205913 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931251 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931269 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931282 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:42:00 crc kubenswrapper[4610]: E1006 08:42:00.931321 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:32.931308679 +0000 UTC m=+84.646362097 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.002145 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.002208 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.002226 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.002249 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.002268 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.070494 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.070631 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:01 crc kubenswrapper[4610]: E1006 08:42:01.070729 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.070755 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:01 crc kubenswrapper[4610]: E1006 08:42:01.070939 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:01 crc kubenswrapper[4610]: E1006 08:42:01.071170 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.105578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.105668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.105683 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.105706 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.105722 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.208556 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.208661 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.208673 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.208755 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.208796 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.312320 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.312391 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.312412 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.312439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.312459 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.416187 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.416250 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.416270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.416308 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.416326 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.519647 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.519691 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.519700 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.519716 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.519731 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.624098 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.624406 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.624423 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.624517 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.624537 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.727739 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.727787 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.727803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.727861 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.727875 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.831150 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.831228 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.831250 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.831279 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.831298 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.934293 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.934341 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.934356 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.934373 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:01 crc kubenswrapper[4610]: I1006 08:42:01.934387 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:01Z","lastTransitionTime":"2025-10-06T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.037430 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.037505 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.037528 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.037565 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.037591 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.069569 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:02 crc kubenswrapper[4610]: E1006 08:42:02.069714 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.140579 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.140633 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.140644 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.140666 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.140681 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.244812 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.244892 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.244912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.244944 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.244966 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.348488 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.348545 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.348562 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.348583 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.348602 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.450474 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.450512 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.450522 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.450536 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.450546 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.552497 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.552541 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.552552 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.552567 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.552577 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.656278 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.656336 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.656350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.656370 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.656385 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.759225 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.759587 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.759743 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.759895 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.760039 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.863413 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.863948 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.864221 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.864477 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.864664 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.967931 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.967994 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.968014 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.968041 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:02 crc kubenswrapper[4610]: I1006 08:42:02.968085 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:02Z","lastTransitionTime":"2025-10-06T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.069489 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.069524 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:03 crc kubenswrapper[4610]: E1006 08:42:03.069718 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.069789 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:03 crc kubenswrapper[4610]: E1006 08:42:03.070734 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:03 crc kubenswrapper[4610]: E1006 08:42:03.070910 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.071281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.071325 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.071337 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.071354 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.071365 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.174869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.175220 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.175316 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.175409 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.175483 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.279335 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.279424 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.279445 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.279475 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.279495 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.383645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.383695 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.383730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.383755 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.383770 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.487607 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.487675 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.487694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.487723 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.487742 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.591855 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.591935 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.591952 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.591978 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.592001 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.699505 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.699591 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.699609 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.699639 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.699667 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.803631 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.803679 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.803690 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.803707 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.803722 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.909563 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.909625 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.909646 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.909671 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:03 crc kubenswrapper[4610]: I1006 08:42:03.909690 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:03Z","lastTransitionTime":"2025-10-06T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.018812 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.018860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.018870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.018887 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.018898 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.070572 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:04 crc kubenswrapper[4610]: E1006 08:42:04.070767 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.122247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.122421 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.122451 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.122486 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.122513 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.225347 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.225411 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.225610 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.225634 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.225650 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.328968 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.329015 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.329034 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.329087 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.329100 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.432079 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.432111 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.432119 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.432131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.432140 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.534688 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.534725 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.534735 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.534749 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.534759 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.637592 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.637669 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.637690 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.637718 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.637739 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.740233 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.740281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.740293 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.740312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.740326 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.844392 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.844450 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.844469 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.844492 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.844508 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.880774 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:04 crc kubenswrapper[4610]: E1006 08:42:04.881089 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:42:04 crc kubenswrapper[4610]: E1006 08:42:04.881229 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:42:20.881195081 +0000 UTC m=+72.596248509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.948481 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.948540 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.948551 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.948573 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:04 crc kubenswrapper[4610]: I1006 08:42:04.948592 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:04Z","lastTransitionTime":"2025-10-06T08:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.052180 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.052238 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.052261 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.052301 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.052325 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.069774 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.069843 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:05 crc kubenswrapper[4610]: E1006 08:42:05.069956 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:05 crc kubenswrapper[4610]: E1006 08:42:05.070224 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.070275 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:05 crc kubenswrapper[4610]: E1006 08:42:05.070449 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.155497 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.155765 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.155894 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.155987 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.156099 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.259115 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.259171 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.259190 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.259208 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.259220 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.361776 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.362026 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.362131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.362206 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.362270 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.465389 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.465629 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.465743 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.465854 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.465932 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.569127 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.569513 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.569578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.569645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.569710 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.672752 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.673081 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.673221 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.673343 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.673453 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.776786 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.776829 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.776841 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.776859 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.776872 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.879981 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.880028 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.880077 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.880099 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.880113 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.982674 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.982721 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.982732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.982750 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:05 crc kubenswrapper[4610]: I1006 08:42:05.982763 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:05Z","lastTransitionTime":"2025-10-06T08:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.069705 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:06 crc kubenswrapper[4610]: E1006 08:42:06.071093 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.084786 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.084996 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.085143 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.085237 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.085313 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.188328 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.188373 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.188384 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.188405 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.188416 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.290832 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.290885 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.290906 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.290931 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.290953 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.320194 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.320254 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.320270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.320295 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.320313 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: E1006 08:42:06.335649 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:06Z is after 
2025-08-24T17:21:41Z" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.340369 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.340422 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.340440 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.340527 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.340564 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: E1006 08:42:06.360865 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.365167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.365201 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.365212 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.365229 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.365243 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.401192 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.401270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.401282 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.401298 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.401317 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.422599 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.422658 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.422677 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.422703 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.422724 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:06Z is after 
2025-08-24T17:21:41Z" Oct 06 08:42:06 crc kubenswrapper[4610]: E1006 08:42:06.436352 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
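The webhook failure and the retry-count error above are one and the same problem: the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, roughly six weeks before the node's current clock of 2025-10-06T08:42:06Z, so the API server rejects every node-status patch until the kubelet's retry budget is exhausted. Below is a minimal Go sketch, not part of any cluster tooling, for confirming the expiry from the node itself; the endpoint is taken from the failing Post URL in the log, and everything else is an illustrative assumption.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint copied from the failing Post URL in the log entry above.
	// InsecureSkipVerify only lets the handshake complete so the presented
	// certificate can be inspected; nothing here trusts the connection.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("serving cert valid from %s until %s\n",
		cert.NotBefore.UTC().Format(time.RFC3339),
		cert.NotAfter.UTC().Format(time.RFC3339))

	// The same validity-window comparison crypto/x509 applies during
	// verification; failing it is what the kubelet logs as
	// "certificate has expired or is not yet valid".
	if now.After(cert.NotAfter) || now.Before(cert.NotBefore) {
		fmt.Printf("invalid: current time %s is outside the window\n",
			now.Format(time.RFC3339))
	}
}
```

If the window check fails here too, the fix is on the certificate side (for example, letting the cluster's certificate rotation catch up), not on the kubelet side; the node's clock agrees with the timestamps journald is recording.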
Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.438109 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.438161 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.438178 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.438202 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.438223 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.541243 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.541326 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.541366 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.541408 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.541430 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.644840 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.644959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.644984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.645018 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.645068 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.747900 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.747944 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.747954 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.747972 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.747983 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.851649 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.851702 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.851719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.851785 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.851805 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.954910 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.954967 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.954977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.954990 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:06 crc kubenswrapper[4610]: I1006 08:42:06.954999 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:06Z","lastTransitionTime":"2025-10-06T08:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.058114 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.058167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.058178 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.058196 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.058210 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
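Interleaved with those failures, the kubelet re-records the NodeNotReady condition roughly every 100 ms because the container runtime still reports NetworkReady=false: nothing has written a CNI configuration into /etc/kubernetes/cni/net.d/ yet. On this cluster that file is expected to appear once OVN-Kubernetes is up, and the ovnkube-node-pqkpj entries further down do show the ovnkube-controller container still unready. A minimal sketch of the presence check behind the message, assuming only the directory path quoted in the log:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory quoted in the log line; kubelet on this node is configured to
	// look here rather than the upstream default /etc/cni/net.d.
	dir := "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		// CNI accepts .conf, .conflist and .json network configurations.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config present:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		// The empty case is what surfaces in the log as NetworkPluginNotReady:
		// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
		fmt.Println("no CNI configuration file in", dir)
	}
}
```

Until a configuration file appears there, every sandbox creation ("No sandbox for pod can be found") and every readiness heartbeat will keep failing with the same message.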
Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.069842 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.069919 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:07 crc kubenswrapper[4610]: E1006 08:42:07.069978 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.070010 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:07 crc kubenswrapper[4610]: E1006 08:42:07.070148 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:07 crc kubenswrapper[4610]: E1006 08:42:07.070233 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.107317 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.124450 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.136094 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.157145 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.161250 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.161287 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.161296 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.161310 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.161320 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.174438 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z"
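The status payloads embedded in these "Failed to update status for pod" entries are hard to read because the JSON merge patch has passed through two rounds of Go string quoting, once when the patch was embedded in the err value and once more when the log line was rendered, which is why every quote appears as \\\". A small sketch that reverses this, shown on a fragment of the kube-scheduler patch earlier in the log (the full payload, and the others in this section, unquote the same way):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// A short fragment copied from the kube-scheduler patch earlier in the log.
	raw := `{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"}}`

	// The payload went through two rounds of Go string quoting: once when the
	// patch was embedded in the err value, once more when the log line itself
	// was rendered. Reverse both with strconv.Unquote.
	s := raw
	for i := 0; i < 2; i++ {
		u, err := strconv.Unquote(`"` + s + `"`)
		if err != nil {
			fmt.Println("unquote:", err)
			return
		}
		s = u
	}

	// What remains is ordinary JSON and can be pretty-printed.
	var buf bytes.Buffer
	if err := json.Indent(&buf, []byte(s), "", "  "); err != nil {
		fmt.Println("indent:", err)
		return
	}
	fmt.Println(buf.String())
}
```

Two strconv.Unquote passes plus json.Indent are enough; no hand-editing of backslashes is needed.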
Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.189268 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.207800 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.225418 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.241062 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.256130 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.263864 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.263898 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.263907 4610 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.263923 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.263934 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.269686 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.285230 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 
08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.307238 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06
d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.321261 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.332493 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.347634 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.382415 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.382453 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.382461 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.382478 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.382492 4610 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.385069 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.400999 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.420469 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:07Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.485901 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.485950 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc 
kubenswrapper[4610]: I1006 08:42:07.485969 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.485999 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.486011 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.588894 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.588929 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.588939 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.588956 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.588967 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.691780 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.692131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.692326 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.692471 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.692595 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.796085 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.796145 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.796168 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.796193 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.796212 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.898959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.899004 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.899017 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.899036 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:07 crc kubenswrapper[4610]: I1006 08:42:07.899073 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:07Z","lastTransitionTime":"2025-10-06T08:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.001614 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.001649 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.001659 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.001673 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.001682 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.070361 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:08 crc kubenswrapper[4610]: E1006 08:42:08.070474 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.103867 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.103904 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.103912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.103924 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.103933 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.206992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.207118 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.207143 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.207175 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.207199 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.310154 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.310202 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.310214 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.310231 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.310247 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.412809 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.412857 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.412871 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.412891 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.412907 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.515293 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.515333 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.515349 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.515371 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.515388 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.618165 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.618210 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.618227 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.618252 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.618271 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.721834 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.721921 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.721934 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.721961 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.721977 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.825831 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.825894 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.825920 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.825952 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.825974 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.929635 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.929724 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.929751 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.929781 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:08 crc kubenswrapper[4610]: I1006 08:42:08.929803 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:08Z","lastTransitionTime":"2025-10-06T08:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.033030 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.033119 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.033137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.033159 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.033176 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.070286 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.070343 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.070310 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:09 crc kubenswrapper[4610]: E1006 08:42:09.070443 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:09 crc kubenswrapper[4610]: E1006 08:42:09.070526 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:09 crc kubenswrapper[4610]: E1006 08:42:09.070610 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.095612 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.123028 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.135401 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.135448 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: 
I1006 08:42:09.135458 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.135474 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.135485 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.157818 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.174849 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.187733 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.198892 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.212591 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 
2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.224980 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.237502 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.237550 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.237562 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.237590 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.237608 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.243609 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce
58b0836f8cb9b57571e47814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.258713 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.280189 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.296872 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.308905 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.324741 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.339348 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.339382 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.339390 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.339407 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 
08:42:09.339417 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.344692 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.358731 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.370424 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.396231 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.442498 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.442550 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.442583 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.442604 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.442617 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.546004 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.546085 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.546093 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.546114 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.546130 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.648639 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.648710 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.648734 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.648763 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.648785 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.752719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.752798 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.752836 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.752869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.752891 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.876018 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.876132 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.876153 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.876187 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.876204 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.979736 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.979797 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.979812 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.979841 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:09 crc kubenswrapper[4610]: I1006 08:42:09.979858 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:09Z","lastTransitionTime":"2025-10-06T08:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.070038 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:10 crc kubenswrapper[4610]: E1006 08:42:10.070251 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.082940 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.082992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.083007 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.083025 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.083067 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.187395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.187530 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.187550 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.187572 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.187590 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.290682 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.290763 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.290781 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.290806 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.290822 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.393397 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.393475 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.393501 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.393530 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.393552 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.497072 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.497137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.497150 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.497176 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.497197 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.600783 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.600860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.600880 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.600909 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.600936 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.704136 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.704198 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.704213 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.704235 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.704250 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.807987 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.808112 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.808145 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.808176 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.808200 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.910538 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.910578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.910589 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.910606 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:10 crc kubenswrapper[4610]: I1006 08:42:10.910620 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:10Z","lastTransitionTime":"2025-10-06T08:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.031262 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.031291 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.031301 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.031316 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.031326 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.070624 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.070722 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:11 crc kubenswrapper[4610]: E1006 08:42:11.070757 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:11 crc kubenswrapper[4610]: E1006 08:42:11.071083 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.071275 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:11 crc kubenswrapper[4610]: E1006 08:42:11.071540 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.135265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.135386 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.135400 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.135422 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.135436 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.239995 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.240081 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.240100 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.240127 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.240283 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.343871 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.343990 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.344014 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.344085 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.344103 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.447448 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.447887 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.448147 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.448462 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.448605 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.552668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.552768 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.552791 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.552818 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.552839 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.655858 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.655939 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.655965 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.656000 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.656030 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.759754 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.759841 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.759860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.759889 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.759909 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.862848 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.863359 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.863516 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.863664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.863805 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.965576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.965839 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.965903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.965967 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:11 crc kubenswrapper[4610]: I1006 08:42:11.966029 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:11Z","lastTransitionTime":"2025-10-06T08:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.069132 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.069435 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.069519 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.069600 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.069685 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.070088 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:12 crc kubenswrapper[4610]: E1006 08:42:12.070307 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.070331 4610 scope.go:117] "RemoveContainer" containerID="e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.172247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.172919 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.172937 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.172955 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.172967 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.281478 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.281952 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.282010 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.282078 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.282092 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.385305 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.385359 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.385369 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.385385 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.385397 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.488381 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.488412 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.488420 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.488432 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.488443 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.545434 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/1.log" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.549713 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.550802 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.570265 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e
19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.584398 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.590729 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.590794 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.590803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.590816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.590825 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.608504 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.625500 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.640946 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.658308 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.680402 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.694138 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.694204 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.694220 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.694246 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.694262 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.705328 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.725931 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.742399 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.760354 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.782187 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.796562 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.796625 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.796635 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.796656 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.796669 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.804472 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.818214 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc 
kubenswrapper[4610]: I1006 08:42:12.832162 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.846334 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.858834 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.879588 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:12Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.899836 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.899869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:12 crc 
kubenswrapper[4610]: I1006 08:42:12.899882 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.899898 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:12 crc kubenswrapper[4610]: I1006 08:42:12.899910 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:12Z","lastTransitionTime":"2025-10-06T08:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.002910 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.002948 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.002960 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.002996 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.003011 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.072155 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:13 crc kubenswrapper[4610]: E1006 08:42:13.072285 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.072491 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:13 crc kubenswrapper[4610]: E1006 08:42:13.072558 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.072858 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:13 crc kubenswrapper[4610]: E1006 08:42:13.072927 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.106143 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.106198 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.106207 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.106225 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.106237 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.209697 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.209746 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.209758 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.209776 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.209788 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.312411 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.312446 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.312456 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.312471 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.312483 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.415611 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.415672 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.415689 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.415716 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.415736 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.518769 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.518809 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.518818 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.518834 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.518848 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.555233 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/2.log" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.555944 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/1.log" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.558776 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" exitCode=1 Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.558843 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.558901 4610 scope.go:117] "RemoveContainer" containerID="e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.560581 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:42:13 crc kubenswrapper[4610]: E1006 08:42:13.560992 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.585028 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 
08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.600000 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.618551 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.622872 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.622894 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.622903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.622920 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.622933 4610 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.634932 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.652376 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.670427 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.685024 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.710821 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e597f66f3bb06b6461e70c8b9674d7ab6a14bdce58b0836f8cb9b57571e47814\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"message\\\":\\\"min network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:41:47Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:41:47.552014 5983 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552020 5983 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1006 08:41:47.552004 5983 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSe\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] 
Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.726164 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.726210 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.726223 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.726247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.726262 4610 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.727295 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.753268 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.768431 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.782961 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.797697 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.811789 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.826144 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.829511 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.829561 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.829576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.829596 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.829609 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.842702 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.859084 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.872502 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:13Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.933037 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.933126 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.933140 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.933166 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:13 crc kubenswrapper[4610]: I1006 08:42:13.933179 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:13Z","lastTransitionTime":"2025-10-06T08:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.036597 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.036657 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.036667 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.036690 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.036704 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.070660 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:14 crc kubenswrapper[4610]: E1006 08:42:14.071470 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.139228 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.139274 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.139286 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.139304 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.139315 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.241834 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.241905 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.241929 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.241960 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.241984 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.344139 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.344211 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.344234 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.344267 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.344502 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.447285 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.447358 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.447383 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.447419 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.447443 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.551945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.552006 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.552015 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.552114 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.552127 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.565741 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/2.log" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.569604 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:42:14 crc kubenswrapper[4610]: E1006 08:42:14.569841 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.589903 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.614865 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.629875 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.644232 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.655325 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.655387 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.655403 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.655425 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.655440 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.661492 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.675783 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.690835 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.707130 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.722844 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.735399 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.758104 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.758150 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.758164 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.758181 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.758195 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.760227 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.773379 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.786362 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.801705 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.818815 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.831433 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.846787 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.858421 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:14Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.860530 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.860559 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.860569 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.860588 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.860601 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.963898 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.963934 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.963943 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.963961 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:14 crc kubenswrapper[4610]: I1006 08:42:14.963973 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:14Z","lastTransitionTime":"2025-10-06T08:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.066260 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.066325 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.066336 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.066354 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.066364 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.069617 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.069625 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:15 crc kubenswrapper[4610]: E1006 08:42:15.069787 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:15 crc kubenswrapper[4610]: E1006 08:42:15.069900 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.069625 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:15 crc kubenswrapper[4610]: E1006 08:42:15.070040 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.168928 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.169015 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.169032 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.169074 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.169094 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.272905 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.272945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.272954 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.272971 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.272982 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.376165 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.376217 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.376310 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.376336 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.376352 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.478427 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.478466 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.478479 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.478500 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.478512 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.580218 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.580260 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.580272 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.580288 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.580298 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.682635 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.682669 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.682679 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.682692 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.682701 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.788324 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.788379 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.788391 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.788411 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.788426 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.891231 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.891264 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.891273 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.891286 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.891296 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.995993 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.996032 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.996063 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.996083 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:15 crc kubenswrapper[4610]: I1006 08:42:15.996094 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:15Z","lastTransitionTime":"2025-10-06T08:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.069621 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.069762 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.098019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.098083 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.098096 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.098110 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.098120 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.199969 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.200019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.200029 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.200076 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.200087 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.303668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.303724 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.303736 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.303757 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.303770 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.406454 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.406521 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.406534 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.406561 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.406577 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.441265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.441308 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.441319 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.441339 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.441351 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.455251 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:16Z is after 2025-08-24T17:21:41Z"
Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.460851 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.460936 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
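The patch itself never reaches the Node object: the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, about six weeks before the timestamps in this log. A minimal sketch (Python, standard library only, intended to be run on the node; the address comes from the log) that attempts the same TLS handshake and surfaces the same verification failure:

```python
import socket
import ssl

# Handshake with the node-identity webhook endpoint from the log.
# A default verifying context should reject the expired certificate;
# depending on the local trust store it may instead report an unknown
# issuer, which equally confirms the serving chain is unusable.
ctx = ssl.create_default_context()
try:
    with socket.create_connection(("127.0.0.1", 9743), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname="127.0.0.1") as tls:
            print("handshake ok, notAfter:", tls.getpeercert()["notAfter"])
except ssl.SSLCertVerificationError as exc:
    print("verification failed:", exc.verify_message)
except OSError as exc:
    print("connection failed:", exc)
```

On this node the expected output is a verification failure matching the log's "certificate has expired or is not yet valid"; the remedy is rotating the webhook's serving certificate (on CRC, typically by letting the cluster refresh its internal certificates), not a kubelet-side change.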
event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.460949 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.460971 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.460984 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.474839 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:16Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.479714 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.479769 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.479781 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.479802 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.479816 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.494567 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:16Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.499816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.499870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.499883 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.499905 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.499922 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.514146 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:16Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.517467 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.517507 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.517520 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.517538 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.517551 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.530752 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:16Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:16 crc kubenswrapper[4610]: E1006 08:42:16.530967 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.532490 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.532520 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.532532 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.532570 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.532584 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.635452 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.635546 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.635566 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.635595 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.635611 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.739732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.739787 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.739817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.739836 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.739845 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.843005 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.843090 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.843105 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.843146 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.843165 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.946327 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.946374 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.946387 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.946411 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:16 crc kubenswrapper[4610]: I1006 08:42:16.946426 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:16Z","lastTransitionTime":"2025-10-06T08:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.049617 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.049677 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.049689 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.049714 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.049728 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:17Z","lastTransitionTime":"2025-10-06T08:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.069790 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.069868 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.069911 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:17 crc kubenswrapper[4610]: E1006 08:42:17.069934 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:17 crc kubenswrapper[4610]: E1006 08:42:17.070013 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:17 crc kubenswrapper[4610]: E1006 08:42:17.070474 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.082206 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.152174 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.152251 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.152270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.152316 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:17 crc kubenswrapper[4610]: I1006 08:42:17.152330 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:17Z","lastTransitionTime":"2025-10-06T08:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
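The recurring "no CNI configuration file in /etc/kubernetes/cni/net.d/" message means the cluster's network plugin has not yet written a network config into that directory, so the runtime keeps reporting NetworkReady=false. A minimal Go sketch of that kind of readiness probe, under the assumption that any *.conf/*.conflist/*.json file counts as a configuration (this approximates, but is not, the actual CRI-O/libcni check):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfDir is the directory the log message refers to.
const cniConfDir = "/etc/kubernetes/cni/net.d"

// hasCNIConfig reports whether any plausible CNI network config exists.
// The real libcni implementation also parses and validates the files.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig(cniConfDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	if !ok {
		// This is the condition behind "NetworkReady=false" in the entries above.
		fmt.Println("no CNI configuration file in", cniConfDir)
	}
}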
[... the same event/condition cycle continues at ~100 ms intervals from 08:42:17.152174 through 08:42:17.981170; entries identical except for timestamps ...]
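Each "Node became not ready" entry carries the same Ready condition payload. A small sketch reproducing that object as it appears in the log, using plain structs rather than the canonical k8s.io/api/core/v1 types:

package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// NodeCondition mirrors only the fields visible in the log entries.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	now := time.Date(2025, 10, 6, 8, 42, 17, 0, time.UTC).Format(time.RFC3339)
	c := NodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
	b, _ := json.Marshal(c)
	fmt.Println(string(b)) // matches the condition={...} payload recorded above
}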
Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.070205 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:42:18 crc kubenswrapper[4610]: E1006 08:42:18.070799 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.085246 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.085304 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.085317 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.085339 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.085350 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.188864 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.188942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.188959 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.189189 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.189209 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.292769 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.292816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.292829 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.292847 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.292859 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.395309 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.396056 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.396144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.396251 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.396346 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.499600 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.499658 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.499668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.499694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.499708 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.603411 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.603465 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.603477 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.603502 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.603514 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.706265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.706452 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.706466 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.706487 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.706500 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.810244 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.810305 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.810321 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.810345 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.810359 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.914996 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.915584 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.915595 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.915617 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:18 crc kubenswrapper[4610]: I1006 08:42:18.915628 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:18Z","lastTransitionTime":"2025-10-06T08:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.018450 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.018519 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.018531 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.018551 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.018563 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.069525 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.069581 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.069525 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:19 crc kubenswrapper[4610]: E1006 08:42:19.069716 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:19 crc kubenswrapper[4610]: E1006 08:42:19.069808 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:19 crc kubenswrapper[4610]: E1006 08:42:19.069879 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.083462 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.102819 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.121920 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.121968 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.121977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.121998 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.122009 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.122512 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.139802 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.163756 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.179969 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.202505 4610 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.219429 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.225730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.225782 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.225797 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.225830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.225851 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.234928 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.253520 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.271322 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.287977 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.303344 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.320069 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50757436-075a-49bc-8602-3d85917d778e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5c6c898928adc83582dd1007b11f63bf3d013f8a53a6b3f9700c7d8de18275\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.329097 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.329134 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.329144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.329161 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.329172 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.341788 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.368290 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.398820 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.416441 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.433972 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.434010 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.434021 4610 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.434054 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.434066 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.436945 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:19Z is after 2025-08-24T17:21:41Z"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.537548 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.537598 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.537610 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.537632 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.537646 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.640916 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.640981 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.640996 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.641020 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
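Every "Failed to update status for pod" entry above shares one root cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate whose NotAfter (2025-08-24T17:21:41Z) lies weeks before the node's clock (2025-10-06T08:42:19Z), so each status patch dies in x509 verification before it ever reaches the API server. Below is a minimal Go sketch of the same validity-window check that crypto/x509 performs during the handshake; the certificate path is a hypothetical stand-in, not taken from this log.

```go
// Illustrative sketch only: the validity-window check that crypto/x509
// applies during the TLS handshake to https://127.0.0.1:9743 above.
// The certificate path is a hypothetical stand-in, not taken from this log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("/tmp/webhook-serving.crt") // hypothetical path
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now()
	if now.After(cert.NotAfter) {
		// Same shape as the log message:
		// "current time ... is after 2025-08-24T17:21:41Z"
		fmt.Printf("x509: certificate has expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	} else if now.Before(cert.NotBefore) {
		fmt.Println("x509: certificate is not yet valid")
	} else {
		fmt.Println("certificate is within its validity window")
	}
}
```

Until that serving certificate is rotated (or the node clock agrees with its validity window), every status patch routed through this webhook will keep failing identically.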
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.641036 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.744509 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.744593 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.744615 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.744646 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.744669 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.847161 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.847210 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.847219 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.847239 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.847251 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.951232 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.951304 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.951323 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.951351 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:19 crc kubenswrapper[4610]: I1006 08:42:19.951371 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:19Z","lastTransitionTime":"2025-10-06T08:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.053664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.053698 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.053706 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.053719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.053727 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.069750 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:20 crc kubenswrapper[4610]: E1006 08:42:20.069956 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.156296 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.156574 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.156695 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.156828 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.156953 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.259134 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.259172 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.259181 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.259196 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.259207 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.362439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.362852 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.363009 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.363336 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.363439 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.466251 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.466295 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.466307 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.466328 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.466342 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.568876 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.569355 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.569509 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.569616 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.569711 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.672692 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.672746 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.672759 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.672780 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.672793 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.774926 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.774986 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.774996 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.775017 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.775028 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.877954 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.878374 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.878439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.878525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.878602 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.976154 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:20 crc kubenswrapper[4610]: E1006 08:42:20.976412 4610 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:42:20 crc kubenswrapper[4610]: E1006 08:42:20.976544 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs podName:a62060d4-5efa-4c4f-851d-8738476f690e nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.976512214 +0000 UTC m=+104.691565602 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs") pod "network-metrics-daemon-46wzl" (UID: "a62060d4-5efa-4c4f-851d-8738476f690e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.982122 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.982165 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.982182 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.982211 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:20 crc kubenswrapper[4610]: I1006 08:42:20.982225 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:20Z","lastTransitionTime":"2025-10-06T08:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.070552 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.070611 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.070664 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:42:21 crc kubenswrapper[4610]: E1006 08:42:21.070791 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:42:21 crc kubenswrapper[4610]: E1006 08:42:21.070949 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e"
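Note how the failed secret mount above is not retried immediately: nestedpendingoperations pushes the next attempt out to 08:42:52, 32 seconds away, and the durationBeforeRetry grows with each failure. A small Go sketch of such a doubling schedule follows; the initial delay and the cap are assumptions chosen only to be consistent with the 32s figure in the log, not values read out of the kubelet source.

```go
// Illustrative sketch of a doubling retry schedule like the one visible in
// "No retries permitted until ... (durationBeforeRetry 32s)". The initial
// delay and the cap are assumptions, not values taken from kubelet source.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond            // assumed initial backoff
	maxDelay := 2*time.Minute + 2*time.Second  // assumed upper bound
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d failed: next retry in %v\n", attempt, delay)
		delay *= 2 // double the wait after each failure
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

On that assumed schedule the seventh consecutive failure waits 32s, matching the entry above; no retry can succeed until the missing object ("openshift-multus"/"metrics-daemon-secret") is actually registered.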
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.085528 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.085565 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.085575 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.085595 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.085610 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.189144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.189202 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.189214 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.189235 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.189247 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.292964 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.293041 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.293091 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.293121 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.293143 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.395774 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.395869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.395883 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.395899 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.395910 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.498873 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.498926 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.498942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.498962 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.498974 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.601155 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.601202 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.601213 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.601230 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.601241 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.704541 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.704615 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.704634 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.704663 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.704681 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.807860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.807928 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.807942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.807962 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.807978 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.910765 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.910817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.910830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.910851 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:21 crc kubenswrapper[4610]: I1006 08:42:21.910864 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:21Z","lastTransitionTime":"2025-10-06T08:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.014205 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.014271 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.014290 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.014323 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.014344 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.069576 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
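The recurring NodeNotReady condition and the "No sandbox for pod can be found" lines pivot on a single predicate: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, so the runtime reports NetworkReady=false and every pod that needs a fresh sandbox is skipped. A minimal Go sketch of that presence check follows; the file extensions are the usual CNI ones and should be read as assumptions, not as the kubelet's exact matching rules.

```go
// Minimal sketch of the readiness condition the kubelet keeps reporting:
// NetworkReady stays false until a CNI configuration file appears in the
// directory named in the messages above. The extensions checked here are
// the common CNI ones; treat them as assumptions, not an exhaustive list.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory from the log messages
	var found []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pat))
		if err != nil {
			continue // Glob only errors on a bad pattern; these literals are fine
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		fmt.Printf("no CNI configuration file in %s. Has your network provider started?\n", dir)
		os.Exit(1)
	}
	fmt.Println("CNI configuration present:", found)
}
```

Once the network provider writes a *.conf or *.conflist into that directory, the NetworkReady condition can flip to true and the skipped pods can get sandboxes.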
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.117209 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.117260 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.117273 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.117295 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.117311 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.219956 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.219992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.220000 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.220021 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.220032 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.323679 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.323722 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.323734 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.323755 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.323769 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.426626 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.426666 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.426681 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.426698 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.426709 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.528611 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.528668 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.528683 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.528704 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.528721 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.632036 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.632123 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.632136 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.632153 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.632165 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.735074 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.735130 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.735141 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.735164 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.735176 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.838946 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.839445 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.839568 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.839720 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.839847 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.943167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.943219 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.943231 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.943252 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:22 crc kubenswrapper[4610]: I1006 08:42:22.943266 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:22Z","lastTransitionTime":"2025-10-06T08:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.047409 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.047491 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.047504 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.047525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.047537 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.069971 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.070034 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:23 crc kubenswrapper[4610]: E1006 08:42:23.070221 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.070275 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:23 crc kubenswrapper[4610]: E1006 08:42:23.070421 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:23 crc kubenswrapper[4610]: E1006 08:42:23.070487 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.150860 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.151377 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.151478 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.151584 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.151695 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.254841 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.254910 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.254924 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.254952 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.254965 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.357920 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.357974 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.357986 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.358007 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.358024 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.461038 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.461111 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.461122 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.461149 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.461162 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.565336 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.566138 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.566215 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.566305 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.566384 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.610070 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdc9x_03a2c34b-edd9-489b-a8e6-23502cdeb309/kube-multus/0.log"
Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.610489 4610 generic.go:334] "Generic (PLEG): container finished" podID="03a2c34b-edd9-489b-a8e6-23502cdeb309" containerID="35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91" exitCode=1
Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.610599 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdc9x" event={"ID":"03a2c34b-edd9-489b-a8e6-23502cdeb309","Type":"ContainerDied","Data":"35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91"}
Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.611547 4610 scope.go:117] "RemoveContainer" containerID="35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91"
Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.625987 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.643302 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50757436-075a-49bc-8602-3d85917d778e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5c6c898928adc83582dd1007b11f63bf3d013f8a53a6b3f9700c7d8de18275\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.669616 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.669664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.669675 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.669697 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.669712 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.671423 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.691052 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.707840 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.725998 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:22Z\\\",\\\"message\\\":\\\"2025-10-06T08:41:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b\\\\n2025-10-06T08:41:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b to /host/opt/cni/bin/\\\\n2025-10-06T08:41:37Z [verbose] multus-daemon 
started\\\\n2025-10-06T08:41:37Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:42:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.744976 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.758458 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.772485 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.772524 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.772538 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.772559 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.772576 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.773706 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.784278 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.793782 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.806384 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.818719 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.830374 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.841658 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.860680 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.876252 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.876308 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc 
kubenswrapper[4610]: I1006 08:42:23.876325 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.876350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.876372 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.876572 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.898584 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e
1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.914322 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.979804 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.979844 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.979879 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.979899 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:23 crc kubenswrapper[4610]: I1006 08:42:23.979910 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:23Z","lastTransitionTime":"2025-10-06T08:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.069975 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:24 crc kubenswrapper[4610]: E1006 08:42:24.070119 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.082699 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.082724 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.082735 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.082748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.082762 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.185329 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.185369 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.185384 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.185401 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.185411 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.287738 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.287808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.287819 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.287836 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.287847 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.389748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.389787 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.389795 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.389829 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.389839 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.492185 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.492255 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.492272 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.492296 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.492311 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.594566 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.594599 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.594609 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.594624 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.594634 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.614677 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdc9x_03a2c34b-edd9-489b-a8e6-23502cdeb309/kube-multus/0.log" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.614722 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdc9x" event={"ID":"03a2c34b-edd9-489b-a8e6-23502cdeb309","Type":"ContainerStarted","Data":"8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.634035 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.649607 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50757436-075a-49bc-8602-3d85917d778e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5c6c898928adc83582dd1007b11f63bf3d013f8a53a6b3f9700c7d8de18275\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.668869 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.685899 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.703512 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.703590 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.703602 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.703587 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.703639 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.703842 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.723490 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:22Z\\\",\\\"message\\\":\\\"2025-10-06T08:41:37+00:00 [cnibincopy] Successfully 
copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b\\\\n2025-10-06T08:41:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b to /host/opt/cni/bin/\\\\n2025-10-06T08:41:37Z [verbose] multus-daemon started\\\\n2025-10-06T08:41:37Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:42:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.737473 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.751688 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.765066 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.784783 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 
08:42:24.802715 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.808071 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.808116 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.808127 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.808148 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.808164 4610 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.828136 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to 
shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.842197 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 
08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.857915 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.874217 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.888255 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.909492 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.911530 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.911579 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.911593 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.911613 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.911627 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:24Z","lastTransitionTime":"2025-10-06T08:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.926114 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:24 crc kubenswrapper[4610]: I1006 08:42:24.938773 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:24Z is after 2025-08-24T17:21:41Z" Oct 06 
08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.013674 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.013720 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.013729 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.013744 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.013754 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.069809 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:25 crc kubenswrapper[4610]: E1006 08:42:25.069928 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.069977 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.069818 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:25 crc kubenswrapper[4610]: E1006 08:42:25.070139 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:25 crc kubenswrapper[4610]: E1006 08:42:25.070163 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.115691 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.115736 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.115748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.115764 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.115775 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.218737 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.218795 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.218810 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.218829 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.218846 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.321842 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.321885 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.321899 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.321917 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.321929 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.424824 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.424870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.424887 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.424911 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.424928 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.526909 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.526947 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.526958 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.526972 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.526983 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.629018 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.629082 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.629094 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.629109 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.629119 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.731011 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.731066 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.731078 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.731093 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.731103 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.834088 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.834123 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.834131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.834144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:25 crc kubenswrapper[4610]: I1006 08:42:25.834152 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:25Z","lastTransitionTime":"2025-10-06T08:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.004673 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.004728 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.004737 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.004757 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.004769 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.070081 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.070201 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.106842 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.106906 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.106930 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.106960 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.106984 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.210576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.210642 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.210682 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.210715 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.210738 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.313677 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.313714 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.313725 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.313740 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.313751 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.416267 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.416330 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.416349 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.416375 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.416390 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.519395 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.519470 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.519493 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.519523 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.519545 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.621733 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.621766 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.621777 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.621793 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.621803 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.663403 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.663440 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.663451 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.663466 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.663478 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.677453 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.681400 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.681422 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.681432 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.681444 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.681453 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.697557 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.701349 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.701386 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
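The status-patch failures above all bottom out in one TLS error: the serving certificate of the node.network-node-identity webhook at 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-06. A minimal sketch for confirming this from the node follows; it assumes Python with the third-party cryptography package (version 42 or later for the *_utc accessors) and that the webhook port is reachable. Host and port are taken from the log.

```python
# Minimal sketch: fetch the webhook's serving certificate and compare its
# validity window against the current time, mirroring the x509 error above.
# Verification is disabled on purpose: an expired certificate would
# otherwise abort the handshake before we could inspect it.
import datetime
import socket
import ssl

from cryptography import x509  # third-party; assumed available

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the log

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False        # must be cleared before verify_mode
ctx.verify_mode = ssl.CERT_NONE   # accept the expired cert

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
now = datetime.datetime.now(datetime.timezone.utc)
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
print("expired:  ", now > cert.not_valid_after_utc)  # True per the log
```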
event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.701397 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.701413 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.701425 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.715974 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.719886 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.719947 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
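The err= payload in these retry entries is a JSON merge patch, rendered in the journal with backslash-escaped quotes. A minimal sketch of unescaping and inspecting it is below; `raw` is a hypothetical, abbreviated stand-in for the real quoted string copied out of the log, and the single level of \" escaping is an assumption about how the string arrives after copying (the journal rendering above shows deeper escaping).

```python
# Minimal sketch: undo the backslash escaping and load the patch the
# kubelet tried to apply. `raw` is a hypothetical abbreviated stand-in for
# the err= payload; substitute the full string from the log to inspect the
# actual conditions, allocatable resources and image list.
import json

raw = r'{\"status\":{\"conditions\":[{\"type\":\"Ready\",\"status\":\"False\",\"reason\":\"KubeletNotReady\"}]}}'

patch = json.loads(raw.replace('\\"', '"'))
for cond in patch["status"]["conditions"]:
    print(cond["type"], cond["status"], cond.get("reason", ""))
```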
event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.719974 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.720002 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.720025 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.735593 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.739684 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.739775 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.739809 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.739840 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.739864 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.755368 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:26 crc kubenswrapper[4610]: E1006 08:42:26.755551 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.757852 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.757919 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.757942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.757971 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.757995 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.860340 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.860421 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.860435 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.860454 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.860466 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.963596 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.963642 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.963652 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.963674 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:26 crc kubenswrapper[4610]: I1006 08:42:26.963687 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:26Z","lastTransitionTime":"2025-10-06T08:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.066083 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.066142 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.066154 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.066174 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.066188 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.069595 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.069669 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.069688 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:27 crc kubenswrapper[4610]: E1006 08:42:27.069752 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:27 crc kubenswrapper[4610]: E1006 08:42:27.069850 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:27 crc kubenswrapper[4610]: E1006 08:42:27.069941 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.169074 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.169137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.169149 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.169173 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.169189 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.272707 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.272766 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.272810 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.272835 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.272850 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.376971 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.377099 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.377128 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.377161 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.377183 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.481390 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.481471 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.481497 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.481531 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.481551 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.584871 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.584918 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.584928 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.584945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.584956 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.688572 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.688649 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.688663 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.688689 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.688705 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.790909 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.790957 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.790973 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.790992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.791009 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.894194 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.894243 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.894254 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.894273 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.894285 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.996561 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.996618 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.996641 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.996669 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:27 crc kubenswrapper[4610]: I1006 08:42:27.996691 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:27Z","lastTransitionTime":"2025-10-06T08:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.070113 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:28 crc kubenswrapper[4610]: E1006 08:42:28.070539 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.070651 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:42:28 crc kubenswrapper[4610]: E1006 08:42:28.070825 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.100343 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.100372 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.100380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.100394 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.100404 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.203728 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.203846 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.203871 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.204105 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.204163 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.307680 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.307734 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.307746 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.307769 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.307783 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.409676 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.409721 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.409732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.409750 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.409761 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.512280 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.512320 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.512334 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.512351 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.512363 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.615536 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.615589 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.615605 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.615650 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.615663 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.718456 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.718541 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.718565 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.718597 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.718620 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.820749 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.820787 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.820802 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.820820 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.820837 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.922911 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.922950 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.922960 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.922978 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:28 crc kubenswrapper[4610]: I1006 08:42:28.922997 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:28Z","lastTransitionTime":"2025-10-06T08:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.025694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.025730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.025738 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.025750 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.025759 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.070395 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.070494 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:29 crc kubenswrapper[4610]: E1006 08:42:29.070648 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.070843 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:29 crc kubenswrapper[4610]: E1006 08:42:29.071073 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:29 crc kubenswrapper[4610]: E1006 08:42:29.071183 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.088087 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.101716 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50757436-075a-49bc-8602-3d85917d778e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5c6c898928adc83582dd1007b11f63bf3d013f8a53a6b3f9700c7d8de18275\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.123270 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed
2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.132325 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.132360 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.132369 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.132383 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.132396 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.136414 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.148510 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.183250 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:22Z\\\",\\\"message\\\":\\\"2025-10-06T08:41:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b\\\\n2025-10-06T08:41:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b to /host/opt/cni/bin/\\\\n2025-10-06T08:41:37Z [verbose] multus-daemon started\\\\n2025-10-06T08:41:37Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:42:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.194919 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.204814 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.219582 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.242817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.242869 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.242883 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.242903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.242914 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.245898 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.255192 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.264403 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc 
Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.277818 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.289482 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.299278 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.314352 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.325519 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.345813 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.345950 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.346008 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.346156 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.346191 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.354572 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e
1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.365345 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.448512 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.448625 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.448645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.448693 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.448709 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.551696 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.551794 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.551808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.551833 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.551849 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.654544 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.654631 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.654651 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.654683 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.654704 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.757854 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.757915 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.757925 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.757942 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.757953 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.860651 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.860716 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.860730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.860755 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.860773 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.963327 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.963377 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.963385 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.963398 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:29 crc kubenswrapper[4610]: I1006 08:42:29.963408 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:29Z","lastTransitionTime":"2025-10-06T08:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.065633 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.065684 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.065694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.065707 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.065716 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.069951 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:30 crc kubenswrapper[4610]: E1006 08:42:30.070098 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.168249 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.168284 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.168292 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.168307 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.168317 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.271350 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.271394 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.271404 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.271419 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.271431 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.373614 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.373643 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.373651 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.373664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.373674 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.476754 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.476795 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.476806 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.476823 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.476836 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.581333 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.581360 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.581368 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.581380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.581389 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.683762 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.683791 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.683817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.683830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.683838 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.787340 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.787399 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.787416 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.787439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.787487 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.891599 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.891640 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.891653 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.891671 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.891683 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.994933 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.994992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.995015 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.995087 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:30 crc kubenswrapper[4610]: I1006 08:42:30.995120 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:30Z","lastTransitionTime":"2025-10-06T08:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.070811 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:31 crc kubenswrapper[4610]: E1006 08:42:31.071750 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.071879 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.071977 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:31 crc kubenswrapper[4610]: E1006 08:42:31.072104 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:31 crc kubenswrapper[4610]: E1006 08:42:31.072223 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.097513 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.097566 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.097578 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.097598 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.097613 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.201702 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.201783 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.201808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.201844 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.201867 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.304443 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.304509 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.304530 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.304559 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.304576 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.407233 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.407312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.407326 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.407346 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.407358 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.510188 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.510279 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.510302 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.510331 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.510348 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.612576 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.612632 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.612644 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.612664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.612676 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.716530 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.716563 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.716571 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.716585 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.716595 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.819105 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.819140 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.819150 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.819165 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.819178 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.921757 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.921816 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.921827 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.921848 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:31 crc kubenswrapper[4610]: I1006 08:42:31.921863 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.024879 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.024944 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.024955 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.024975 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.024987 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.070318 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:32 crc kubenswrapper[4610]: E1006 08:42:32.070511 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
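The five-entry block above is one pass of the kubelet's node-status sync: four "Recording event message" entries followed by setters.go serializing the Ready condition it is about to write to the API server. The payload is plain JSON, and the root cause is stated inside it: the container runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet. A minimal sketch that decodes the logged condition, assuming only the fields visible in the log line (the real type is corev1.NodeCondition from k8s.io/api):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors just the fields that appear in the setters.go log
// line above; it is a simplified stand-in for k8s.io/api/core/v1.NodeCondition.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied verbatim from the 08:42:31.097613 entry.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:31Z","lastTransitionTime":"2025-10-06T08:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s reason=%s\n%s\n", c.Type, c.Status, c.Reason, c.Message)
}
```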
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.128317 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.128722 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.129086 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.129312 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.129401 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.233408 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.233768 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.233977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.234182 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.234334 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.338124 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.338631 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.338760 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.338903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.339083 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.443189 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.443609 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.443703 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.443790 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.443877 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.547186 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.547230 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.547239 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.547259 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.547273 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.649834 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.649899 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.649911 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.649934 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.649953 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.752852 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.752914 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.752926 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.752945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.752961 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.855203 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.855271 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.855282 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.855307 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.855319 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.924183 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:32 crc kubenswrapper[4610]: E1006 08:42:32.924389 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:36.924350573 +0000 UTC m=+148.639404001 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.959238 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.959280 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.959298 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.959326 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:32 crc kubenswrapper[4610]: I1006 08:42:32.959345 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:32Z","lastTransitionTime":"2025-10-06T08:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
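The unmount fails because the kubevirt.io.hostpath-provisioner CSI plugin has not (re)registered with this kubelet since the restart, so the CSI client lookup has nothing to hand back, and the operation is parked rather than retried hot. The durationBeforeRetry of 1m4s is consistent with a doubling backoff: 64 s is 500 ms doubled seven times. A small sketch of such a retry schedule; the starting delay, factor, and cap here are assumptions chosen to reproduce the figure in the log, not the kubelet's exact constants:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed schedule: start at 500ms, double on every consecutive
	// failure, cap at ~2m. Under these assumptions the 8th consecutive
	// failure waits 1m4s, matching "durationBeforeRetry 1m4s" above.
	delay := 500 * time.Millisecond
	maxDelay := 2*time.Minute + 2*time.Second

	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("failure %2d -> next retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```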
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.026162 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.026487 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.026616 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.026327 4610 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.026760 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.026724 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.026786 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.026995 4610 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.026823 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:43:37.026801561 +0000 UTC m=+148.741854959 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.027084 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:43:37.027064958 +0000 UTC m=+148.742118356 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.026667 4610 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.027136 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:43:37.02712351 +0000 UTC m=+148.742177108 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.027397 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.027494 4610 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.027568 4610 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.027703 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:43:37.027685116 +0000 UTC m=+148.742738704 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
[the node-status block repeats at 08:42:33.063]
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.069592 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.069594 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.069824 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.070031 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.070368 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:33 crc kubenswrapper[4610]: E1006 08:42:33.070593 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e"
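Every failure in the burst above reduces to the same string: object "<namespace>"/"<name>" not registered. After the kubelet restart, configmap, secret, and projected volume sources are resolved through the kubelet's per-pod object managers, and a pod's sources must be registered there before a mount can read them; until that happens the lookup fails outright instead of falling through to the API server. A toy model of that gate (an assumption for illustration, not the kubelet's actual code):

```go
package main

import "fmt"

// objectKey identifies a configmap or secret by namespace and name.
type objectKey struct{ namespace, name string }

// registry stands in for the kubelet's per-pod object manager: lookups
// succeed only for sources that have already been registered for some pod.
type registry struct{ objects map[objectKey][]byte }

func (r *registry) get(ns, name string) ([]byte, error) {
	if data, ok := r.objects[objectKey{ns, name}]; ok {
		return data, nil
	}
	// Mirrors the error string seen throughout the log above.
	return nil, fmt.Errorf("object %q/%q not registered", ns, name)
}

func main() {
	r := &registry{objects: map[objectKey][]byte{}} // nothing registered yet
	if _, err := r.get("openshift-network-diagnostics", "kube-root-ca.crt"); err != nil {
		fmt.Println(err)
	}
}
```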
pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.165413 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.165479 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.165494 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.165612 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.165634 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.267798 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.268071 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.268162 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.268308 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.268402 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.371091 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.371174 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.371198 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.371226 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.371244 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.473301 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.473369 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.473380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.473403 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.473417 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.577444 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.577507 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.577519 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.577544 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.577557 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.680557 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.680923 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.680990 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.681086 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.681152 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.784315 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.784712 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.784904 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.785085 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.785243 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.892460 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.892508 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.892519 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.892537 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.892549 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.996889 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.996941 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.996962 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.996987 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:33 crc kubenswrapper[4610]: I1006 08:42:33.997005 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:33Z","lastTransitionTime":"2025-10-06T08:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:34 crc kubenswrapper[4610]: I1006 08:42:34.070031 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:34 crc kubenswrapper[4610]: E1006 08:42:34.070413 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:34 crc kubenswrapper[4610]: I1006 08:42:34.100344 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:34 crc kubenswrapper[4610]: I1006 08:42:34.100426 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:34 crc kubenswrapper[4610]: I1006 08:42:34.100447 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:34 crc kubenswrapper[4610]: I1006 08:42:34.100473 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:34 crc kubenswrapper[4610]: I1006 08:42:34.100490 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:34Z","lastTransitionTime":"2025-10-06T08:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[identical node-status blocks repeat every ~100 ms at 08:42:34.203, .306, .409, .512, .615, .718, .823, .926 and 08:42:35.030]
Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.070207 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.070245 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.070330 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:42:35 crc kubenswrapper[4610]: E1006 08:42:35.070390 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:42:35 crc kubenswrapper[4610]: E1006 08:42:35.070555 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:42:35 crc kubenswrapper[4610]: E1006 08:42:35.070632 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e"
[the node-status block repeats at 08:42:35.134]
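The same three pods fail once per second: the kubelet wants to create a fresh sandbox for each, but sandbox creation is gated on the runtime reporting NetworkReady=true, and the runtime keeps answering that no CNI configuration file exists in /etc/kubernetes/cni/net.d/ (on OpenShift that file is written by the cluster network components once they come up). A minimal sketch of the readiness gate as the error message describes it; this is an assumption for illustration, not the CRI-O/ocicni source:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniReady reports whether at least one CNI config file is present, which
// is the condition the "NetworkReady=false ... no CNI configuration file"
// messages above are waiting on.
func cniReady(dir string) bool {
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		if matches, _ := filepath.Glob(filepath.Join(dir, pattern)); len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	if !cniReady(dir) {
		fmt.Fprintf(os.Stderr, "network plugin not ready: no CNI configuration file in %s\n", dir)
		os.Exit(1)
	}
	fmt.Println("NetworkReady=true")
}
```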
Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.237343 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.237417 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.237438 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.237470 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.237491 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.339981 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.340014 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.340027 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.340072 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.340091 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.443120 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.443222 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.443242 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.443304 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.443322 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.546480 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.546525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.546537 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.546554 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.546565 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.649649 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.649727 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.649747 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.649774 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.649790 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.753605 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.753890 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.753953 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.754018 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.754116 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.856751 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.856793 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.856802 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.856815 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.856824 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.959947 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.959984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.959992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.960010 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:35 crc kubenswrapper[4610]: I1006 08:42:35.960019 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:35Z","lastTransitionTime":"2025-10-06T08:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.062570 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.062624 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.062644 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.062664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.062681 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.069889 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.070006 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.172433 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.172688 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.172918 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.173057 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.173151 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.276614 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.276664 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.276676 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.276693 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.276704 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.379732 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.380168 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.380233 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.380320 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.380385 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.483788 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.483833 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.483850 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.483873 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.483889 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.586813 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.587265 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.587336 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.587463 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.587534 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.689934 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.689997 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.690015 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.690068 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.690087 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.792574 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.792975 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.793189 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.793223 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.793238 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.895667 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.895716 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.895730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.895746 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.895759 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.896645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.896682 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.896693 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.896708 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.896718 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.908665 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.913626 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.913667 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.913677 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.913716 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.913729 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.926803 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.930604 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.930667 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.930688 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.930717 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.930739 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.952819 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.956439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.956502 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.956525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.956553 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.956577 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.969523 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.974136 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.974194 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.974207 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.974243 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.974256 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.989700 4610 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ca67adee-388a-4a79-b348-5f88a51a6438\\\",\\\"systemUUID\\\":\\\"a268cadd-0c3c-491c-869f-df56a4b697a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:36 crc kubenswrapper[4610]: E1006 08:42:36.989930 4610 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.998364 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.998434 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.998445 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.998483 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:36 crc kubenswrapper[4610]: I1006 08:42:36.998497 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:36Z","lastTransitionTime":"2025-10-06T08:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.071270 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.071311 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.071270 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:37 crc kubenswrapper[4610]: E1006 08:42:37.071412 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:37 crc kubenswrapper[4610]: E1006 08:42:37.071478 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:37 crc kubenswrapper[4610]: E1006 08:42:37.071531 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.101449 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.101501 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.101513 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.101531 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.101544 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.204388 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.204439 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.204450 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.204468 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.204481 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.307946 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.307997 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.308012 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.308030 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.308058 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.410729 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.410813 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.410830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.410849 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.410862 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.514038 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.514137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.514159 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.514185 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.514203 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.617864 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.617921 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.617936 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.617951 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.617963 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.720776 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.720817 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.720828 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.720843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.720853 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.823701 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.823766 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.823783 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.823808 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.823825 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.928100 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.928185 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.928220 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.928249 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:37 crc kubenswrapper[4610]: I1006 08:42:37.928269 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:37Z","lastTransitionTime":"2025-10-06T08:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.030500 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.030532 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.030540 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.030554 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.030563 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.070444 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:38 crc kubenswrapper[4610]: E1006 08:42:38.071433 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.135373 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.135913 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.135927 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.135949 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.135960 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.239228 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.239270 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.239280 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.239300 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.239313 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.443247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.443309 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.443320 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.443341 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.443353 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.546171 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.546225 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.546237 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.546257 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.546270 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.648629 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.648666 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.648675 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.648694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.648705 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.751768 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.752328 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.752339 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.752360 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.752370 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.854758 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.854804 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.854820 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.854839 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.854850 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.957627 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.957697 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.957717 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.957743 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:38 crc kubenswrapper[4610]: I1006 08:42:38.957762 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:38Z","lastTransitionTime":"2025-10-06T08:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.061167 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.061229 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.061251 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.061278 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.061298 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.069781 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.069855 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:39 crc kubenswrapper[4610]: E1006 08:42:39.069991 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.070274 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:39 crc kubenswrapper[4610]: E1006 08:42:39.070519 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:39 crc kubenswrapper[4610]: E1006 08:42:39.070665 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.094600 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.116370 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3879ac0b86915b2306d130e6f9df9964b50ee2a47782605de74e3f11ff360729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.134462 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://637220b5f5fa5c44cd16c7a96d1924484efefb7b74335244a1e494b285ffee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://943e76fa1d69be621363d3f7681e8db0ae90be589045f019c63cb2d39a24570b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.146670 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kf48m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1deae4d-39c0-4684-8851-d2e6da166a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0737c56cd51111deee3e1e8769858d765bf2418d185c696fb83951df5a9fd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bp96d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kf48m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.161879 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a19d05-9838-4c7d-aa2c-e778a2ef0148\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe5e740daee7868b1b4657b5767c9482e7b274d194608de7e1866a2f96b75dcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvk2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6w5xr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.163681 4610 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.163720 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.163730 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.163751 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.163765 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.186388 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980266ef-4c63-4532-8b33-25fa1c57a9a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e
1816cfd75b9d3988a49670ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:13Z\\\",\\\"message\\\":\\\"c7f79c-55gtf\\\\nI1006 08:42:13.163664 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163670 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163675 6296 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj in node crc\\\\nI1006 08:42:13.163679 6296 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-pqkpj after 0 failed attempt(s)\\\\nI1006 08:42:13.163684 6296 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-pqkpj\\\\nI1006 08:42:13.163692 6296 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163698 6296 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-kf48m\\\\nI1006 08:42:13.163702 6296 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-kf48m in node crc\\\\nF1006 08:42:13.163704 6296 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pqkpj_openshift-ovn-kubernetes(980266ef-4c63-4532-8b33-25fa1c57a9a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fv5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pqkpj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.201924 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb2ae6e-2e9c-49b3-a7b3-2d9037d563d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://675fd161dc01d2426852e215a079616ed3edb687f3745f573e6407ef74d8be63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fb26201076143592c580adae78cb52a314b7550db8adf8a237e5f4550709bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9nxj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpbkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.216696 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-46wzl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62060d4-5efa-4c4f-851d-8738476f690e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wl7kk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-46wzl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.233511 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0529c71-e83a-40b9-8bef-a216fc1da3af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6854eb9563aeee37c121a7ee0f732e3683fb5f9167b1249fea6b9ca569272ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://536ba3af1f8efca79ac193a56f682eb884347220267dc5428208ab562f20f109\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e409b26ffe6aeaa87ab5f27efc8b9a9c1fc4ea995edc19ccf098100e91b78e34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://370ac51ad2a9a593d6449bd84a18a4d15a87b69f099e98b5e76e850a0217c433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b43d1909f0a4c1da03e883335f2f58adbfa35783c1c1f04e1bc25a190a4728\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:41:29Z\\\",\\\"message\\\":\\\"W1006 08:41:14.383242 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:41:14.384019 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759740074 cert, and key in /tmp/serving-cert-3069563372/serving-signer.crt, /tmp/serving-cert-3069563372/serving-signer.key\\\\nI1006 08:41:14.951119 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:41:28.873141 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:41:28.873252 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:41:28.874259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3069563372/tls.crt::/tmp/serving-cert-3069563372/tls.key\\\\\\\"\\\\nI1006 08:41:29.055523 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:41:29.059688 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:41:29.059723 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:41:29.059756 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:41:29.059765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1006 08:41:29.071757 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1cf9893b57301c8b62f8ace1d9a17970bca1c90e1463952fee038aab40d1dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4465867e951ca72bae76a929855a05c79cc8246b104dba0b69eaab8f6def120b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.247020 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.258850 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85baa75e53db8cecc2681e01482111d087b99eb363a5befcd7335779fa7cb491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.266907 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.266981 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.266991 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.267014 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.267027 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.279222 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d58xp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ba0f3b-6a75-44b9-b9ca-75c81656eb4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64a782ac856f8c98832ee3bcbf299bc0b52252e63193f28e59dbef390b32394c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6889dfbc1a9c40d11e34c2162d7da8e18c832565090fceabd62408dc7b4a293c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4430cbaa5f8fba262bcc05eabb1386783ff66a10f322cb1f2a1b7514636c2bf5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88bcb9201b20ea2d0db716d236badf47b5a0b8e9e3cc99332ebbffc4b952ed8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8c86305623f2651a0c9716ba9c9219a55e67b3204269e0807c2aa62d27ba54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ca0b1feab116f48dccbcbf644e00c0d925a5c9a07575a0b0e4eb3dde783b1f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://972f962c1e1b9c3af5e6511f308c99c1a6cf7fb95d929b7e90c74df515956c45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbgrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d58xp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.296753 4610 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1937f9fe-9624-486d-8dec-b8ab9654ec95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321cb49744fca487af3902e62376cb5230f82aa1086b24709f9abb5dbae156c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c14fd1c89e3795243525667bb96f639f64a5dfa44536f05522639ce0040820d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b6fe422446cc381277902789ea6cfd8f4e19bd4c6d94b6a2270cff1694960b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b559c70e60f2f54f3cdf1280b238d0c34dd1c2997582251225ed42eb63eb34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.309327 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50757436-075a-49bc-8602-3d85917d778e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5c6c898928adc83582dd1007b11f63bf3d013f8a53a6b3f9700c7d8de18275\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c5627849f8d626aade6c2f4e75c7cca788e4ac1b78c51ef18f2c341940bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.321169 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8tw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d96b4ed-90ec-4ea5-b244-55c3b8f55def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ab5e9ac7e7e87c6bea1e0b24e41bb10bf67dac2fbb94beeccda59283a8783d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7m8vw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
25-10-06T08:41:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8tw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.351980 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"295ab7b1-5165-4732-aa27-84b25801662c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://085185ec11f609e2738f92d69769ede3b6bfcb3f814baa37b9c034f4baaadd40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8259678bb73c79bb3cc710c325674b37136772ec8ee14a0771ced7df53254907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2687299d21df4d2aad5c987b5b3b40a6e4609d8ae04dac4d588264b2f463b8a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa792bd8d17ecd5903f3df0524046fae0941ed2e74212056d5e60b24fb7c93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20926a87998636fc9306c4fd1fa56a6c651442ec9993bd3b353b22ef7b8c498d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a641558f84e1fde240026ea572096c19fecab6733f9a137f2003003d67dcae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6db8ce5d4037d06176a4564896f30d6fd2126669a2b2d198ad759f85b16d8a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07ec7632b9fc5704bcca092dfce3c84909f91bcca5505aeeabbd68c45b3f8b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.369334 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.369383 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.369394 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.369413 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.369424 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.374729 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c112c54-7763-4337-86d1-56424d7c684a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://541d8a53b56e85b41f5d131445f28cb7809f4cda135cea1de8c2b671ff4bb089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6aa0f60d81db21b3d59ef497aad05d698ed36afa1b908e519810a9205f82f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17f2adc2e15f48f3d008da9eb1639e4b1b88f92e5b5d834b5c1e2159cb6682b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac3900bccacd5768e3df14742bb704622a602a419964b442f30be62040fe7e5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.391631 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.406360 4610 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kdc9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03a2c34b-edd9-489b-a8e6-23502cdeb309\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:42:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:42:22Z\\\",\\\"message\\\":\\\"2025-10-06T08:41:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b\\\\n2025-10-06T08:41:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_034dda05-4ea3-4ab6-bd44-7a6a3f16395b to /host/opt/cni/bin/\\\\n2025-10-06T08:41:37Z [verbose] multus-daemon started\\\\n2025-10-06T08:41:37Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:42:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:41:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q747\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:41:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kdc9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:42:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.471655 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.471694 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.471706 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.471725 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.471735 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.574137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.574204 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.574223 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.574247 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.574265 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.676463 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.676806 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.676992 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.677226 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.677355 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.780451 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.780877 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.781041 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.781238 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.781377 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.884123 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.884523 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.884735 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.884949 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.885183 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.989626 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.990385 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.990653 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.990903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:39 crc kubenswrapper[4610]: I1006 08:42:39.991112 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:39Z","lastTransitionTime":"2025-10-06T08:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.070470 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:40 crc kubenswrapper[4610]: E1006 08:42:40.071478 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.094459 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.094506 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.094517 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.094537 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.094551 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.198281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.198334 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.198353 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.198380 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.198392 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.301281 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.301332 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.301342 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.301358 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.301367 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.404726 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.405153 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.405405 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.405667 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.405866 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.508719 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.508765 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.508777 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.508794 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.508808 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.610813 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.611323 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.611498 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.611690 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.611868 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.714766 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.714803 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.714812 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.714826 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.714835 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.817826 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.817937 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.817961 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.818021 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.818100 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.922255 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.922326 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.922345 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.922369 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:40 crc kubenswrapper[4610]: I1006 08:42:40.922386 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:40Z","lastTransitionTime":"2025-10-06T08:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.025303 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.025371 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.025393 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.025421 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.025444 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.070445 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:41 crc kubenswrapper[4610]: E1006 08:42:41.070595 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.070471 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.070467 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:41 crc kubenswrapper[4610]: E1006 08:42:41.070688 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:41 crc kubenswrapper[4610]: E1006 08:42:41.070862 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.127608 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.127676 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.127693 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.127720 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.127741 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.230186 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.230243 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.230253 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.230267 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.230276 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.332314 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.332362 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.332383 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.332402 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.332413 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.434877 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.434919 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.434930 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.434945 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.434955 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.536870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.536903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.536912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.536924 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.536933 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.638785 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.638819 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.638831 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.638846 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.638857 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.741603 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.741912 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.742010 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.742129 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.742229 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.845505 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.845562 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.845577 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.845594 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.845604 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.948231 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.948300 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.948310 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.948324 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:41 crc kubenswrapper[4610]: I1006 08:42:41.948365 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:41Z","lastTransitionTime":"2025-10-06T08:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.051677 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.051722 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.051734 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.051759 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.051772 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.070566 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:42 crc kubenswrapper[4610]: E1006 08:42:42.070725 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.071651 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.154024 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.154136 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.154163 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.154193 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.154217 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.256707 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.256727 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.256734 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.256748 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.256757 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.358843 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.358889 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.358903 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.358919 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.358929 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.460703 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.460733 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.460744 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.460757 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.460767 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.563207 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.563248 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.563256 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.563271 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.563280 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.665830 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.665868 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.665879 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.665895 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.665908 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.674498 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/2.log" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.677481 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerStarted","Data":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.677907 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.709843 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.709823027 podStartE2EDuration="1m10.709823027s" podCreationTimestamp="2025-10-06 08:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.708459048 +0000 UTC m=+94.423512446" watchObservedRunningTime="2025-10-06 08:42:42.709823027 +0000 UTC m=+94.424876415" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.736524 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.736499027 podStartE2EDuration="1m11.736499027s" podCreationTimestamp="2025-10-06 08:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.723638671 +0000 UTC m=+94.438692069" watchObservedRunningTime="2025-10-06 08:42:42.736499027 +0000 UTC m=+94.451552425" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.763388 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kdc9x" podStartSLOduration=68.763364942 podStartE2EDuration="1m8.763364942s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.751077162 
+0000 UTC m=+94.466130560" watchObservedRunningTime="2025-10-06 08:42:42.763364942 +0000 UTC m=+94.478418340" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.768812 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.768853 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.768870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.768890 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.768905 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.778083 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v8tw6" podStartSLOduration=68.778022239 podStartE2EDuration="1m8.778022239s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.763959379 +0000 UTC m=+94.479012797" watchObservedRunningTime="2025-10-06 08:42:42.778022239 +0000 UTC m=+94.493075647" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.870814 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.870877 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.870890 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.870932 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.870946 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.890392 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kf48m" podStartSLOduration=68.890353968 podStartE2EDuration="1m8.890353968s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.87074655 +0000 UTC m=+94.585799948" watchObservedRunningTime="2025-10-06 08:42:42.890353968 +0000 UTC m=+94.605407356" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.913638 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.913620981 podStartE2EDuration="1m13.913620981s" podCreationTimestamp="2025-10-06 08:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.91326583 +0000 UTC m=+94.628319218" watchObservedRunningTime="2025-10-06 08:42:42.913620981 +0000 UTC m=+94.628674359" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.926630 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-46wzl"] Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.927020 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:42 crc kubenswrapper[4610]: E1006 08:42:42.927282 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.968763 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d58xp" podStartSLOduration=68.96873086 podStartE2EDuration="1m8.96873086s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.968209185 +0000 UTC m=+94.683262593" watchObservedRunningTime="2025-10-06 08:42:42.96873086 +0000 UTC m=+94.683784258" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.973824 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.973862 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.973870 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.973886 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.973897 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:42Z","lastTransitionTime":"2025-10-06T08:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:42 crc kubenswrapper[4610]: I1006 08:42:42.984487 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podStartSLOduration=68.984469388 podStartE2EDuration="1m8.984469388s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:42.983263244 +0000 UTC m=+94.698316652" watchObservedRunningTime="2025-10-06 08:42:42.984469388 +0000 UTC m=+94.699522776" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.014156 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podStartSLOduration=69.014131153 podStartE2EDuration="1m9.014131153s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:43.01299281 +0000 UTC m=+94.728046208" watchObservedRunningTime="2025-10-06 08:42:43.014131153 +0000 UTC m=+94.729184541" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.024548 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpbkh" podStartSLOduration=69.024526269 podStartE2EDuration="1m9.024526269s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:43.023728156 +0000 UTC m=+94.738781544" watchObservedRunningTime="2025-10-06 08:42:43.024526269 +0000 UTC m=+94.739579657" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.038782 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.038760604 podStartE2EDuration="43.038760604s" podCreationTimestamp="2025-10-06 08:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:43.03826598 +0000 UTC m=+94.753319368" watchObservedRunningTime="2025-10-06 08:42:43.038760604 +0000 UTC m=+94.753813992" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.064825 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.064806706 podStartE2EDuration="26.064806706s" podCreationTimestamp="2025-10-06 08:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:43.064769425 +0000 UTC m=+94.779822813" watchObservedRunningTime="2025-10-06 08:42:43.064806706 +0000 UTC m=+94.779860094" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.069913 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.069948 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:43 crc kubenswrapper[4610]: E1006 08:42:43.070069 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:43 crc kubenswrapper[4610]: E1006 08:42:43.070157 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.075317 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.075362 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.075376 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.075392 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.075403 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.178219 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.178262 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.178273 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.178289 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.178301 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.281076 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.281125 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.281137 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.281158 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.281171 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.384131 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.384173 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.384183 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.384199 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.384211 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.487013 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.487061 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.487070 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.487081 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.487092 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.589915 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.589965 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.589977 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.589997 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.590011 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.692509 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.692546 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.692556 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.692572 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.692584 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.795197 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.795237 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.795250 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.795268 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.795282 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.897844 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.897889 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.897901 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.897917 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:43 crc kubenswrapper[4610]: I1006 08:42:43.897930 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:43Z","lastTransitionTime":"2025-10-06T08:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.001090 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.001132 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.001144 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.001163 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.001175 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.070499 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.070533 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:42:44 crc kubenswrapper[4610]: E1006 08:42:44.070661 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-46wzl" podUID="a62060d4-5efa-4c4f-851d-8738476f690e" Oct 06 08:42:44 crc kubenswrapper[4610]: E1006 08:42:44.070808 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.104181 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.104252 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.104266 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.104290 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.104304 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.206480 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.206525 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.206536 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.206552 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.206563 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.309403 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.309457 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.309473 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.309492 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.309505 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.411852 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.411984 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.412019 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.412083 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.412108 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.515595 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.515645 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.515663 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.515696 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.515733 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.618396 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.618455 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.618473 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.618489 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.618501 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.721240 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.721284 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.721296 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.721313 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.721325 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.823921 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.823967 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.823978 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.823995 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.824005 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.927413 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.927463 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.927479 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.927504 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:44 crc kubenswrapper[4610]: I1006 08:42:44.927522 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:44Z","lastTransitionTime":"2025-10-06T08:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.030958 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.031035 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.031092 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.031121 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.031146 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:45Z","lastTransitionTime":"2025-10-06T08:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.069616 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.069693 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:42:45 crc kubenswrapper[4610]: E1006 08:42:45.069825 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:42:45 crc kubenswrapper[4610]: E1006 08:42:45.069935 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.137944 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.138007 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.138026 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.138121 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.138147 4610 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:42:45Z","lastTransitionTime":"2025-10-06T08:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.243571 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.243621 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.243633 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.243649 4610 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.243777 4610 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.295847 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.296327 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.296718 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.299736 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xj96b"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.300020 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.300359 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.300911 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.301761 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.302248 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.302552 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.302654 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.302992 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.306458 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.306910 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-crmz6"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.307307 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8p28v"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.307690 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.308145 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.308334 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.308445 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.309169 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.314316 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.316798 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.316997 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e95710e9-1583-407e-9cee-377d17a9c70d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317125 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-etcd-client\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317197 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-serving-cert\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317294 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70fef81-f099-4afe-b277-2418e2cd3d8e-config\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317359 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b70fef81-f099-4afe-b277-2418e2cd3d8e-trusted-ca\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317441 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4rs\" (UniqueName: \"kubernetes.io/projected/e95710e9-1583-407e-9cee-377d17a9c70d-kube-api-access-jw4rs\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317528 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-audit-policies\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317656 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-encryption-config\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.317755 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70fef81-f099-4afe-b277-2418e2cd3d8e-serving-cert\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.318017 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95710e9-1583-407e-9cee-377d17a9c70d-serving-cert\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.318103 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqq7j\" (UniqueName: \"kubernetes.io/projected/b70fef81-f099-4afe-b277-2418e2cd3d8e-kube-api-access-fqq7j\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.318162 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52l8d\" (UniqueName: \"kubernetes.io/projected/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-kube-api-access-52l8d\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.318191 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.318213 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-audit-dir\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.318877 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mgm5v"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.319259 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.321132 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.328433 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.328487 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.328539 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.328603 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.334607 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.336911 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.337250 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.337694 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.337850 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9g4kq"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.337564 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.338426 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.337610 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.338540 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.338657 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.338870 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339113 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339186 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339122 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339316 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339558 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339667 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.339742 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.340004 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mm5ft"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.340333 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.340448 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.340464 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.341083 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.341362 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.341607 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.341713 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.341823 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.341915 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342032 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342166 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342303 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342374 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342443 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342505 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342520 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342538 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342606 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342660 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342077 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: 
I1006 08:42:45.342768 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342826 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342847 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342886 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342911 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342937 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342964 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342993 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343039 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343101 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343114 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342472 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343171 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343065 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342085 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342104 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.342125 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343075 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.343517 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 06 08:42:45 crc kubenswrapper[4610]: 
I1006 08:42:45.344814 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.344959 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.353583 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.355192 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-klkbs"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.355847 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-klkbs"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.356405 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.358529 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.359326 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.360658 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.376542 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.377102 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn"]
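[Annotation] The bursts of reflector.go:368 "Caches populated" entries interleaved through this stretch are the kubelet's per-object reflectors completing their initial list/watch for every ConfigMap and Secret an admitted pod references; a volume source cannot be mounted before its object's cache is populated. Below is a short sketch that tallies those entries by namespace and kind, keying on the object-"<namespace>"/"<name>" form visible in these lines (an observation about this capture's format, not a documented interface):

    # Annotation sketch: count populated reflector caches per namespace/kind.
    import re
    import sys
    from collections import Counter

    CACHE = re.compile(
        r'Caches populated for \*v1\.(?P<kind>\w+)'
        r' from object-"(?P<ns>[^"]+)"/"(?P<name>[^"]+)"')

    def tally(lines):
        """Count "Caches populated" entries per (namespace, kind)."""
        counts = Counter()
        for line in lines:
            if m := CACHE.search(line):
                counts[(m["ns"], m["kind"])] += 1
        return counts

    if __name__ == "__main__":
        for (ns, kind), n in tally(sys.stdin).most_common():
            print(f"{n:3d} {kind:<9} {ns}")

Run over this capture it would show openshift-authentication near the top of the counts, consistent with the many v4-0-config-* objects the oauth-openshift pod mounts.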
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.379444 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.379523 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.380290 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.380543 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.380801 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.380937 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.381077 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.380364 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.381191 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.381397 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.384358 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbjqb"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.384731 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.385846 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.385960 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.386142 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.386885 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.387533 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.387660 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.387768 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.387973 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.388417 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.390329 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.390750 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.393548 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.393821 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.393946 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.393954 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.394129 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.394301 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.394548 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.394081 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.396435 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.400180 4610 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.402464 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.408995 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.411375 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k6pjc"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.411813 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.411913 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.412245 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.414558 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kxqjk"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.415331 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.416693 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.418222 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.451959 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.457524 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461795 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-trusted-ca-bundle\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461838 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70fef81-f099-4afe-b277-2418e2cd3d8e-config\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461879 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b70fef81-f099-4afe-b277-2418e2cd3d8e-trusted-ca\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " 
pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461903 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzck\" (UniqueName: \"kubernetes.io/projected/e10a4f9e-fd83-4951-bd62-6e274077d37d-kube-api-access-phzck\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461924 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-oauth-config\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461949 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4rs\" (UniqueName: \"kubernetes.io/projected/e95710e9-1583-407e-9cee-377d17a9c70d-kube-api-access-jw4rs\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.461974 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-audit-policies\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.462251 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10a4f9e-fd83-4951-bd62-6e274077d37d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.467700 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.469116 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70fef81-f099-4afe-b277-2418e2cd3d8e-config\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.469974 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.471353 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.471402 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.471820 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-service-ca\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.471856 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-encryption-config\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472228 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95710e9-1583-407e-9cee-377d17a9c70d-serving-cert\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472252 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mkts\" (UniqueName: \"kubernetes.io/projected/f60e974a-cd62-411f-9838-0f03704829fc-kube-api-access-4mkts\") pod \"dns-operator-744455d44c-crmz6\" (UID: \"f60e974a-cd62-411f-9838-0f03704829fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472278 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqq7j\" (UniqueName: \"kubernetes.io/projected/b70fef81-f099-4afe-b277-2418e2cd3d8e-kube-api-access-fqq7j\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472293 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b71d01f-9ec2-42f2-9271-822c00b5c142-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472312 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52l8d\" (UniqueName: \"kubernetes.io/projected/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-kube-api-access-52l8d\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472330 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/75726254-6806-4c39-a565-f48ca0eb4fd3-kube-api-access-jpwrq\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472347 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472761 4610 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472944 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.472349 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473114 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-serving-cert\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473153 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwc6p\" (UniqueName: \"kubernetes.io/projected/0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a-kube-api-access-qwc6p\") pod \"downloads-7954f5f757-klkbs\" (UID: \"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a\") " pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473184 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70fef81-f099-4afe-b277-2418e2cd3d8e-serving-cert\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473217 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b71d01f-9ec2-42f2-9271-822c00b5c142-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473242 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f60e974a-cd62-411f-9838-0f03704829fc-metrics-tls\") pod \"dns-operator-744455d44c-crmz6\" (UID: \"f60e974a-cd62-411f-9838-0f03704829fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473265 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-oauth-serving-cert\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473287 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10a4f9e-fd83-4951-bd62-6e274077d37d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473309 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-console-config\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473332 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473354 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-audit-dir\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473376 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e10a4f9e-fd83-4951-bd62-6e274077d37d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473402 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p789h\" (UniqueName: \"kubernetes.io/projected/2e14ca24-0be1-4cbc-bbdb-68331334706b-kube-api-access-p789h\") pod \"cluster-samples-operator-665b6dd947-n88h5\" (UID: \"2e14ca24-0be1-4cbc-bbdb-68331334706b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473426 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e95710e9-1583-407e-9cee-377d17a9c70d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473447 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e14ca24-0be1-4cbc-bbdb-68331334706b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n88h5\" (UID: \"2e14ca24-0be1-4cbc-bbdb-68331334706b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473477 4610 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-etcd-client\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473498 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-serving-cert\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473522 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvq8\" (UniqueName: \"kubernetes.io/projected/4b71d01f-9ec2-42f2-9271-822c00b5c142-kube-api-access-xsvq8\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.473690 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.475476 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b70fef81-f099-4afe-b277-2418e2cd3d8e-trusted-ca\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.475907 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-audit-policies\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.476058 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.476251 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-audit-dir\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.476494 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e95710e9-1583-407e-9cee-377d17a9c70d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.476684 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.476710 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.476784 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.477081 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.478100 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.478353 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.478467 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.478876 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.480627 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.481174 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.481475 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.482492 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.482770 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.482880 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.492833 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.493122 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-etcd-client\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.494084 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.494627 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x8h4d"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.495024 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.495408 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.495625 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.495797 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.496227 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t5gpw"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.496630 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.496819 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.497235 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.498540 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95710e9-1583-407e-9cee-377d17a9c70d-serving-cert\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.498599 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.498730 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70fef81-f099-4afe-b277-2418e2cd3d8e-serving-cert\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.499076 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-encryption-config\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.499367 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.501330 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.502000 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.502553 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.502998 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.506753 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-serving-cert\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.507939 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.510140 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.518457 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-crmz6"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.519731 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-99z72"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.520183 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.520854 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.521327 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.527112 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.527894 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.527970 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qrfc4"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.528596 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.530341 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.531208 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mgm5v"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.541158 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xj96b"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.542016 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.542248 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5fwjj"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.543112 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.543332 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.548214 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9g4kq"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.549619 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mm5ft"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.556431 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fjb8x"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.559427 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.559547 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.560896 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-s5jn2"] Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.561779 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.564392 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574347 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b71d01f-9ec2-42f2-9271-822c00b5c142-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574473 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f60e974a-cd62-411f-9838-0f03704829fc-metrics-tls\") pod \"dns-operator-744455d44c-crmz6\" (UID: \"f60e974a-cd62-411f-9838-0f03704829fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574509 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-oauth-serving-cert\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574542 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10a4f9e-fd83-4951-bd62-6e274077d37d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574565 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-console-config\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574590 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e10a4f9e-fd83-4951-bd62-6e274077d37d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574618 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p789h\" (UniqueName: \"kubernetes.io/projected/2e14ca24-0be1-4cbc-bbdb-68331334706b-kube-api-access-p789h\") pod \"cluster-samples-operator-665b6dd947-n88h5\" (UID: \"2e14ca24-0be1-4cbc-bbdb-68331334706b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574649 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e14ca24-0be1-4cbc-bbdb-68331334706b-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-n88h5\" (UID: \"2e14ca24-0be1-4cbc-bbdb-68331334706b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574709 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvq8\" (UniqueName: \"kubernetes.io/projected/4b71d01f-9ec2-42f2-9271-822c00b5c142-kube-api-access-xsvq8\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574749 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-trusted-ca-bundle\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574801 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzck\" (UniqueName: \"kubernetes.io/projected/e10a4f9e-fd83-4951-bd62-6e274077d37d-kube-api-access-phzck\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574847 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-oauth-config\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574961 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10a4f9e-fd83-4951-bd62-6e274077d37d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.574984 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-service-ca\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.575054 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mkts\" (UniqueName: \"kubernetes.io/projected/f60e974a-cd62-411f-9838-0f03704829fc-kube-api-access-4mkts\") pod \"dns-operator-744455d44c-crmz6\" (UID: \"f60e974a-cd62-411f-9838-0f03704829fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.575130 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b71d01f-9ec2-42f2-9271-822c00b5c142-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" 
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.575191 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/75726254-6806-4c39-a565-f48ca0eb4fd3-kube-api-access-jpwrq\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.575230 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-serving-cert\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.576069 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwc6p\" (UniqueName: \"kubernetes.io/projected/0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a-kube-api-access-qwc6p\") pod \"downloads-7954f5f757-klkbs\" (UID: \"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a\") " pod="openshift-console/downloads-7954f5f757-klkbs"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.576267 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b71d01f-9ec2-42f2-9271-822c00b5c142-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.579625 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10a4f9e-fd83-4951-bd62-6e274077d37d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.580169 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-console-config\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.581517 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-oauth-serving-cert\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.581968 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k6pjc"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.582691 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-trusted-ca-bundle\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.583593 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.584033 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.586713 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kxqjk"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.588087 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-service-ca\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.589251 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8p28v"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.591436 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e10a4f9e-fd83-4951-bd62-6e274077d37d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.594885 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b71d01f-9ec2-42f2-9271-822c00b5c142-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.595676 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.598487 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-serving-cert\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.598565 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.598835 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e14ca24-0be1-4cbc-bbdb-68331334706b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n88h5\" (UID: \"2e14ca24-0be1-4cbc-bbdb-68331334706b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.599277 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f60e974a-cd62-411f-9838-0f03704829fc-metrics-tls\") pod \"dns-operator-744455d44c-crmz6\" (UID: \"f60e974a-cd62-411f-9838-0f03704829fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-crmz6"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.602009 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.602240 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.603803 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.605098 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.606966 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-oauth-config\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.607703 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.609408 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.611172 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.612853 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.615923 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-klkbs"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.617119 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbjqb"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.622702 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.622748 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.628108 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.629580 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t5gpw"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.630941 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.632464 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.633951 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x8h4d"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.635538 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5fwjj"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.637219 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.638222 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kjgjr"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.639757 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qrfc4"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.639901 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.640906 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kjgjr"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.642336 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.643855 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fjb8x"]
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.664306 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.682464 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.703204 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.722208 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.742076 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.762260 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.781938 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.802609 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.822079 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.842700 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.863082 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.882856 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.902214 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.928892 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.942927 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.963359 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 06 08:42:45 crc kubenswrapper[4610]: I1006 08:42:45.981765 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.036572 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqq7j\" (UniqueName: \"kubernetes.io/projected/b70fef81-f099-4afe-b277-2418e2cd3d8e-kube-api-access-fqq7j\") pod \"console-operator-58897d9998-xj96b\" (UID: \"b70fef81-f099-4afe-b277-2418e2cd3d8e\") " pod="openshift-console-operator/console-operator-58897d9998-xj96b"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.056385 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4rs\" (UniqueName: \"kubernetes.io/projected/e95710e9-1583-407e-9cee-377d17a9c70d-kube-api-access-jw4rs\") pod \"openshift-config-operator-7777fb866f-lzjcc\" (UID: \"e95710e9-1583-407e-9cee-377d17a9c70d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.069892 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.070595 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.077476 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52l8d\" (UniqueName: \"kubernetes.io/projected/3f6c789d-43eb-48d0-aad4-cd6eef7bc706-kube-api-access-52l8d\") pod \"apiserver-7bbb656c7d-l4qt7\" (UID: \"3f6c789d-43eb-48d0-aad4-cd6eef7bc706\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.081878 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.101945 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.123353 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.148121 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.161351 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.182904 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.204203 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.222649 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.229936 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.242037 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.243291 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.262458 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.281520 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xj96b"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.283216 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.302846 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.324650 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.342444 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.363506 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.382725 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.404162 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.422928 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.442941 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.463438 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.486841 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.502009 4610 request.go:700] Waited for 1.005997425s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcollect-profiles-dockercfg-kzf4t&limit=500&resourceVersion=0
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.505370 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.512195 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xj96b"]
Oct 06 08:42:46 crc kubenswrapper[4610]: W1006 08:42:46.521112 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70fef81_f099_4afe_b277_2418e2cd3d8e.slice/crio-b75f92743284e1ae83040c2c281701174fd187f4c8aea1b07d35df31dacd64da WatchSource:0}: Error finding container b75f92743284e1ae83040c2c281701174fd187f4c8aea1b07d35df31dacd64da: Status 404 returned error can't find the container with id b75f92743284e1ae83040c2c281701174fd187f4c8aea1b07d35df31dacd64da
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.521998 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.541666 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.562342 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.582901 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.602822 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.626181 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.644310 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.653883 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc"]
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.655110 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7"]
Oct 06 08:42:46 crc kubenswrapper[4610]: W1006 08:42:46.669216 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6c789d_43eb_48d0_aad4_cd6eef7bc706.slice/crio-16b775418e986778eb5a92dbba80b0db9ef28ce6a4b3d07f5424857c846f0b8b WatchSource:0}: Error finding container 16b775418e986778eb5a92dbba80b0db9ef28ce6a4b3d07f5424857c846f0b8b: Status 404 returned error can't find the container with id 16b775418e986778eb5a92dbba80b0db9ef28ce6a4b3d07f5424857c846f0b8b
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.669458 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.682264 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.689481 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xj96b" event={"ID":"b70fef81-f099-4afe-b277-2418e2cd3d8e","Type":"ContainerStarted","Data":"b75f92743284e1ae83040c2c281701174fd187f4c8aea1b07d35df31dacd64da"}
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.690468 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" event={"ID":"3f6c789d-43eb-48d0-aad4-cd6eef7bc706","Type":"ContainerStarted","Data":"16b775418e986778eb5a92dbba80b0db9ef28ce6a4b3d07f5424857c846f0b8b"}
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.691329 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" event={"ID":"e95710e9-1583-407e-9cee-377d17a9c70d","Type":"ContainerStarted","Data":"53a9879e946fa42f24cc2ae6b272096b11be834fe4eba7938131079522be29a4"}
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.703070 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.722629 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.746784 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.762346 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.782336 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.803232 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.822397 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.842395 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.862549 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.882692 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.903575 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.922778 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.942387 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.962932 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 06 08:42:46 crc kubenswrapper[4610]: I1006 08:42:46.983315 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.003432 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.022103 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.042199 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.061935 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.069734 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.070826 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.082622 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.101606 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.142383 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.161653 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.182694 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193784 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed7e624-4555-4dc2-85dc-cb3e305262dd-config\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193821 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193846 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193864 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwf6\" (UniqueName: \"kubernetes.io/projected/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-kube-api-access-hmwf6\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193893 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1293a8cf-7266-4bf1-bc49-b8369656484b-serving-cert\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193908 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193928 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dadf4701-13e0-4382-a658-2e2bd9d52ecb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193958 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/afc38731-aa1b-4009-a1f0-5877684d53f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193973 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc38731-aa1b-4009-a1f0-5877684d53f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.193986 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afc38731-aa1b-4009-a1f0-5877684d53f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194001 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-service-ca-bundle\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194025 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-trusted-ca\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194058 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b1e8485-a57c-4aab-bf55-01fc322047cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194132 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db79ee81-c008-4374-9523-e762c47c9668-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194150 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-policies\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194202 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86ad86a-2294-4bd5-b065-f590c5d46c19-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194225 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194255 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194277 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194319 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be459a27-8ce8-4825-9b01-a89a33fb81d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194341 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194364 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194379 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-bound-sa-token\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194396 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/afc38731-aa1b-4009-a1f0-5877684d53f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194412 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2wl\" (UniqueName: \"kubernetes.io/projected/2ed7e624-4555-4dc2-85dc-cb3e305262dd-kube-api-access-rf2wl\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194426 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b1e8485-a57c-4aab-bf55-01fc322047cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194452 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vpf\" (UniqueName: \"kubernetes.io/projected/5e6695a0-e257-46a6-9459-7b476baa633b-kube-api-access-b6vpf\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194467 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadf4701-13e0-4382-a658-2e2bd9d52ecb-config\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\"
(UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194480 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dadf4701-13e0-4382-a658-2e2bd9d52ecb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194495 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1e8485-a57c-4aab-bf55-01fc322047cf-config\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194510 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2ed7e624-4555-4dc2-85dc-cb3e305262dd-auth-proxy-config\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194527 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194551 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194570 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v8c6\" (UniqueName: \"kubernetes.io/projected/e86ad86a-2294-4bd5-b065-f590c5d46c19-kube-api-access-6v8c6\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194585 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nkmc\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-kube-api-access-5nkmc\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194600 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-client-ca\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194614 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bt6z\" (UniqueName: \"kubernetes.io/projected/1293a8cf-7266-4bf1-bc49-b8369656484b-kube-api-access-8bt6z\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194630 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194652 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-config\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194668 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-tls\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194682 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-config\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194697 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be459a27-8ce8-4825-9b01-a89a33fb81d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194711 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2ed7e624-4555-4dc2-85dc-cb3e305262dd-machine-approver-tls\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194725 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194749 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194772 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86ad86a-2294-4bd5-b065-f590c5d46c19-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194788 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb88n\" (UniqueName: \"kubernetes.io/projected/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-kube-api-access-tb88n\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194804 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-config\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194819 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-serving-cert\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194834 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db79ee81-c008-4374-9523-e762c47c9668-images\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194850 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-dir\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194866 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-certificates\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194881 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6695a0-e257-46a6-9459-7b476baa633b-serving-cert\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194905 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-client-ca\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194928 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqh7w\" (UniqueName: \"kubernetes.io/projected/db79ee81-c008-4374-9523-e762c47c9668-kube-api-access-qqh7w\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194947 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79ee81-c008-4374-9523-e762c47c9668-config\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.194986 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.195001 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/afc38731-aa1b-4009-a1f0-5877684d53f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.196735 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:47.696722923 +0000 UTC m=+99.411776311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.202466 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.222397 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.242478 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.263535 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.282340 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.295645 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.295891 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:47.795862987 +0000 UTC m=+99.510916395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296204 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296292 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/afc38731-aa1b-4009-a1f0-5877684d53f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296334 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-csi-data-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296364 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-audit\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296390 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvwx5\" (UniqueName: \"kubernetes.io/projected/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-kube-api-access-vvwx5\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296421 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjtf\" (UniqueName: \"kubernetes.io/projected/f8b408c0-509e-4ff0-9688-7d142ec0a14e-kube-api-access-lfjtf\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296480 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-srv-cert\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296534 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed7e624-4555-4dc2-85dc-cb3e305262dd-config\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296566 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296612 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/afc38731-aa1b-4009-a1f0-5877684d53f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296645 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1293a8cf-7266-4bf1-bc49-b8369656484b-serving-cert\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296674 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dadf4701-13e0-4382-a658-2e2bd9d52ecb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296704 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-image-import-ca\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296753 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-ca\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296782 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-trusted-ca\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296810 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b1e8485-a57c-4aab-bf55-01fc322047cf-serving-cert\") 
pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296840 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x52n5\" (UniqueName: \"kubernetes.io/projected/c86e3a40-53eb-466d-ad93-268de0cb9e0d-kube-api-access-x52n5\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296870 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-policies\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296900 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ee7285a-6ddc-4b48-a89b-a6692ba95ce6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jxcr7\" (UID: \"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296933 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296968 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mkk\" (UniqueName: \"kubernetes.io/projected/351aa4d4-e29f-4405-9985-5953396ca08e-kube-api-access-52mkk\") pod \"control-plane-machine-set-operator-78cbb6b69f-hljsk\" (UID: \"351aa4d4-e29f-4405-9985-5953396ca08e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.296995 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fzl\" (UniqueName: \"kubernetes.io/projected/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-kube-api-access-b8fzl\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297024 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/351aa4d4-e29f-4405-9985-5953396ca08e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hljsk\" (UID: \"351aa4d4-e29f-4405-9985-5953396ca08e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:47 crc 
kubenswrapper[4610]: I1006 08:42:47.297113 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86ad86a-2294-4bd5-b065-f590c5d46c19-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297148 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297206 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-config\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297237 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-node-bootstrap-token\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297270 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be459a27-8ce8-4825-9b01-a89a33fb81d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297303 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297309 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/afc38731-aa1b-4009-a1f0-5877684d53f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297333 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297405 4610 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297448 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-webhook-cert\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297490 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e3a40-53eb-466d-ad93-268de0cb9e0d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297526 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-metrics-certs\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297623 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vpf\" (UniqueName: \"kubernetes.io/projected/5e6695a0-e257-46a6-9459-7b476baa633b-kube-api-access-b6vpf\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297670 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-encryption-config\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297707 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2ed7e624-4555-4dc2-85dc-cb3e305262dd-auth-proxy-config\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297746 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-images\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297798 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297836 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v8c6\" (UniqueName: \"kubernetes.io/projected/e86ad86a-2294-4bd5-b065-f590c5d46c19-kube-api-access-6v8c6\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297876 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297913 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt6z\" (UniqueName: \"kubernetes.io/projected/1293a8cf-7266-4bf1-bc49-b8369656484b-kube-api-access-8bt6z\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297948 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.297986 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-serving-cert\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298019 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-default-certificate\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298103 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5f2d\" (UniqueName: \"kubernetes.io/projected/fb2950a4-c31b-47fd-bc69-84015e5e58c5-kube-api-access-z5f2d\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298142 4610 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298174 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86e3a40-53eb-466d-ad93-268de0cb9e0d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298213 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-tls\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298247 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-mountpoint-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298279 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-service-ca\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298314 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-proxy-tls\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298346 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrgch\" (UniqueName: \"kubernetes.io/projected/8ee7285a-6ddc-4b48-a89b-a6692ba95ce6-kube-api-access-zrgch\") pod \"package-server-manager-789f6589d5-jxcr7\" (UID: \"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298405 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298439 
4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-config\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298460 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed7e624-4555-4dc2-85dc-cb3e305262dd-config\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298471 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-registration-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298510 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-client\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298539 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-certificates\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298564 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-socket-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298595 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9e5708d8-3116-4947-a74f-9551fbfdb501-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x8h4d\" (UID: \"9e5708d8-3116-4947-a74f-9551fbfdb501\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298646 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-apiservice-cert\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298672 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqh7w\" (UniqueName: \"kubernetes.io/projected/db79ee81-c008-4374-9523-e762c47c9668-kube-api-access-qqh7w\") pod 
\"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298695 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-plugins-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298719 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bx84\" (UniqueName: \"kubernetes.io/projected/b549d43d-f011-4c3b-9fd6-b3af936f56ed-kube-api-access-7bx84\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298769 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79ee81-c008-4374-9523-e762c47c9668-config\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298791 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb2950a4-c31b-47fd-bc69-84015e5e58c5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298818 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt2t\" (UniqueName: \"kubernetes.io/projected/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-kube-api-access-hmt2t\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298841 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5kz\" (UniqueName: \"kubernetes.io/projected/13830283-fabe-488e-98a3-767df413452b-kube-api-access-ks5kz\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298863 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkr5v\" (UniqueName: \"kubernetes.io/projected/70fadbce-fb8b-4f7a-bbff-3036aba2a51e-kube-api-access-mkr5v\") pod \"migrator-59844c95c7-5xmkk\" (UID: \"70fadbce-fb8b-4f7a-bbff-3036aba2a51e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298885 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13830283-fabe-488e-98a3-767df413452b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: 
\"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298908 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-tmpfs\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.298927 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb2950a4-c31b-47fd-bc69-84015e5e58c5-srv-cert\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.299908 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:47.79986462 +0000 UTC m=+99.514918018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.300298 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2ed7e624-4555-4dc2-85dc-cb3e305262dd-auth-proxy-config\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.301787 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-policies\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304074 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304297 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79ee81-c008-4374-9523-e762c47c9668-config\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304339 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304523 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b1e8485-a57c-4aab-bf55-01fc322047cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304602 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfd9\" (UniqueName: \"kubernetes.io/projected/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-kube-api-access-qhfd9\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304633 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f59d\" (UniqueName: \"kubernetes.io/projected/5a502918-8f04-4208-b05c-d1d3fe1f7110-kube-api-access-4f59d\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304680 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304706 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmwf6\" (UniqueName: \"kubernetes.io/projected/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-kube-api-access-hmwf6\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304735 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/55cea12f-f7be-4dde-9f2a-906005391f52-signing-key\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304762 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304755 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304891 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-stats-auth\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304914 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-certs\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304948 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc38731-aa1b-4009-a1f0-5877684d53f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304971 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afc38731-aa1b-4009-a1f0-5877684d53f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.304994 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-service-ca-bundle\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305018 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96x75\" (UniqueName: \"kubernetes.io/projected/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-kube-api-access-96x75\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305057 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/889b475b-a608-4aa3-ad0b-1824bd08b2a2-serving-cert\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305092 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-etcd-serving-ca\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305104 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305128 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db79ee81-c008-4374-9523-e762c47c9668-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305169 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/55cea12f-f7be-4dde-9f2a-906005391f52-signing-cabundle\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305193 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dd5x\" (UniqueName: \"kubernetes.io/projected/9e5708d8-3116-4947-a74f-9551fbfdb501-kube-api-access-6dd5x\") pod \"multus-admission-controller-857f4d67dd-x8h4d\" (UID: \"9e5708d8-3116-4947-a74f-9551fbfdb501\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305213 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/137638ac-cb39-4dc4-b21a-93f89b9297b6-trusted-ca\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305236 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-config\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305801 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86ad86a-2294-4bd5-b065-f590c5d46c19-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.305840 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.307040 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-certificates\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.307136 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be459a27-8ce8-4825-9b01-a89a33fb81d6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.307880 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.307937 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.307973 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308061 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308147 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-service-ca-bundle\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308210 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-profile-collector-cert\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308271 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-config-volume\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308317 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtlm4\" (UniqueName: \"kubernetes.io/projected/8b1b1162-9959-4680-b21e-078eb49dba97-kube-api-access-xtlm4\") pod \"ingress-canary-5fwjj\" (UID: \"8b1b1162-9959-4680-b21e-078eb49dba97\") " pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308383 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-bound-sa-token\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308460 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/afc38731-aa1b-4009-a1f0-5877684d53f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308531 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2wl\" (UniqueName: \"kubernetes.io/projected/2ed7e624-4555-4dc2-85dc-cb3e305262dd-kube-api-access-rf2wl\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308563 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/afc38731-aa1b-4009-a1f0-5877684d53f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308637 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b1e8485-a57c-4aab-bf55-01fc322047cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308698 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-serving-cert\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308777 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadf4701-13e0-4382-a658-2e2bd9d52ecb-config\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: 
\"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308802 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dadf4701-13e0-4382-a658-2e2bd9d52ecb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308827 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308888 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b549d43d-f011-4c3b-9fd6-b3af936f56ed-audit-dir\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.308959 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309020 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1e8485-a57c-4aab-bf55-01fc322047cf-config\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309115 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b04ea21-e24d-4d1c-861e-28746c304f7d-service-ca-bundle\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309211 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588hb\" (UniqueName: \"kubernetes.io/projected/137638ac-cb39-4dc4-b21a-93f89b9297b6-kube-api-access-588hb\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309250 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m674x\" (UniqueName: \"kubernetes.io/projected/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-kube-api-access-m674x\") pod \"dns-default-fjb8x\" (UID: 
\"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309284 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a502918-8f04-4208-b05c-d1d3fe1f7110-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309350 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nkmc\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-kube-api-access-5nkmc\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309400 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-config\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309484 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-client-ca\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309520 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/137638ac-cb39-4dc4-b21a-93f89b9297b6-metrics-tls\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309553 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-config-volume\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309586 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69z8\" (UniqueName: \"kubernetes.io/projected/0b04ea21-e24d-4d1c-861e-28746c304f7d-kube-api-access-g69z8\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309600 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dadf4701-13e0-4382-a658-2e2bd9d52ecb-config\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309620 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-config\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309659 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbhd4\" (UniqueName: \"kubernetes.io/projected/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-kube-api-access-gbhd4\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309662 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.309787 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310486 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-trusted-ca\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310649 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-config\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310712 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-client-ca\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310725 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be459a27-8ce8-4825-9b01-a89a33fb81d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310791 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/2ed7e624-4555-4dc2-85dc-cb3e305262dd-machine-approver-tls\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310822 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkgqq\" (UniqueName: \"kubernetes.io/projected/55cea12f-f7be-4dde-9f2a-906005391f52-kube-api-access-jkgqq\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310852 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310879 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b549d43d-f011-4c3b-9fd6-b3af936f56ed-node-pullsecrets\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310952 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b1b1162-9959-4680-b21e-078eb49dba97-cert\") pod \"ingress-canary-5fwjj\" (UID: \"8b1b1162-9959-4680-b21e-078eb49dba97\") " pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.310978 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13830283-fabe-488e-98a3-767df413452b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.311033 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86ad86a-2294-4bd5-b065-f590c5d46c19-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.311096 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137638ac-cb39-4dc4-b21a-93f89b9297b6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.311587 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-config\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: 
\"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.311716 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1e8485-a57c-4aab-bf55-01fc322047cf-config\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.311917 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312123 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-config\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312249 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-serving-cert\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312303 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb88n\" (UniqueName: \"kubernetes.io/projected/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-kube-api-access-tb88n\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312347 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-metrics-tls\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312411 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db79ee81-c008-4374-9523-e762c47c9668-images\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312514 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312827 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-dir\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.312878 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a502918-8f04-4208-b05c-d1d3fe1f7110-proxy-tls\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.313784 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db79ee81-c008-4374-9523-e762c47c9668-images\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.313871 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-config\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.314304 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.314669 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-dir\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.314725 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db79ee81-c008-4374-9523-e762c47c9668-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.314754 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6695a0-e257-46a6-9459-7b476baa633b-serving-cert\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.314849 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-client-ca\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.314914 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-etcd-client\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.315194 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-secret-volume\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.315314 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whlw\" (UniqueName: \"kubernetes.io/projected/889b475b-a608-4aa3-ad0b-1824bd08b2a2-kube-api-access-5whlw\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.315491 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1293a8cf-7266-4bf1-bc49-b8369656484b-serving-cert\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.315575 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc38731-aa1b-4009-a1f0-5877684d53f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.316095 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-client-ca\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.316938 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be459a27-8ce8-4825-9b01-a89a33fb81d6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.317005 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-tls\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.317752 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dadf4701-13e0-4382-a658-2e2bd9d52ecb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.318230 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-serving-cert\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.318622 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.318690 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86ad86a-2294-4bd5-b065-f590c5d46c19-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.319502 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.321689 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6695a0-e257-46a6-9459-7b476baa633b-serving-cert\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.323017 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.324191 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/afc38731-aa1b-4009-a1f0-5877684d53f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.324744 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2ed7e624-4555-4dc2-85dc-cb3e305262dd-machine-approver-tls\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.341706 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.361598 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.382024 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.403250 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421002 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421304 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whlw\" (UniqueName: \"kubernetes.io/projected/889b475b-a608-4aa3-ad0b-1824bd08b2a2-kube-api-access-5whlw\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421359 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-csi-data-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421392 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-audit\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421442 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvwx5\" (UniqueName: \"kubernetes.io/projected/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-kube-api-access-vvwx5\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421475 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjtf\" (UniqueName: \"kubernetes.io/projected/f8b408c0-509e-4ff0-9688-7d142ec0a14e-kube-api-access-lfjtf\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421508 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-srv-cert\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27"
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421641 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-image-import-ca\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421689 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-ca\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421729 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x52n5\" (UniqueName: \"kubernetes.io/projected/c86e3a40-53eb-466d-ad93-268de0cb9e0d-kube-api-access-x52n5\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421781 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ee7285a-6ddc-4b48-a89b-a6692ba95ce6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jxcr7\" (UID: \"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421834 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421880 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52mkk\" (UniqueName: \"kubernetes.io/projected/351aa4d4-e29f-4405-9985-5953396ca08e-kube-api-access-52mkk\") pod \"control-plane-machine-set-operator-78cbb6b69f-hljsk\" (UID: \"351aa4d4-e29f-4405-9985-5953396ca08e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421920 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fzl\" (UniqueName: \"kubernetes.io/projected/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-kube-api-access-b8fzl\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421954 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/351aa4d4-e29f-4405-9985-5953396ca08e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hljsk\" 
(UID: \"351aa4d4-e29f-4405-9985-5953396ca08e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.421990 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-config\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422025 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-node-bootstrap-token\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422090 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422124 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-webhook-cert\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422158 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e3a40-53eb-466d-ad93-268de0cb9e0d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422191 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-metrics-certs\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422225 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-encryption-config\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-images\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: 
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422329 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422362 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-serving-cert\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422423 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5f2d\" (UniqueName: \"kubernetes.io/projected/fb2950a4-c31b-47fd-bc69-84015e5e58c5-kube-api-access-z5f2d\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422470 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-default-certificate\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422516 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422556 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86e3a40-53eb-466d-ad93-268de0cb9e0d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422590 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-mountpoint-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422623 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-service-ca\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422656 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-proxy-tls\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422697 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgch\" (UniqueName: \"kubernetes.io/projected/8ee7285a-6ddc-4b48-a89b-a6692ba95ce6-kube-api-access-zrgch\") pod \"package-server-manager-789f6589d5-jxcr7\" (UID: \"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422752 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-config\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422799 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-registration-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422843 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-client\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.422894 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-socket-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423005 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9e5708d8-3116-4947-a74f-9551fbfdb501-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x8h4d\" (UID: \"9e5708d8-3116-4947-a74f-9551fbfdb501\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423077 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-apiservice-cert\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998"
Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423127 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-plugins-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr"
08:42:47.423164 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bx84\" (UniqueName: \"kubernetes.io/projected/b549d43d-f011-4c3b-9fd6-b3af936f56ed-kube-api-access-7bx84\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423210 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb2950a4-c31b-47fd-bc69-84015e5e58c5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423251 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmt2t\" (UniqueName: \"kubernetes.io/projected/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-kube-api-access-hmt2t\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423285 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5kz\" (UniqueName: \"kubernetes.io/projected/13830283-fabe-488e-98a3-767df413452b-kube-api-access-ks5kz\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423318 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-tmpfs\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423351 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb2950a4-c31b-47fd-bc69-84015e5e58c5-srv-cert\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423384 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkr5v\" (UniqueName: \"kubernetes.io/projected/70fadbce-fb8b-4f7a-bbff-3036aba2a51e-kube-api-access-mkr5v\") pod \"migrator-59844c95c7-5xmkk\" (UID: \"70fadbce-fb8b-4f7a-bbff-3036aba2a51e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423416 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13830283-fabe-488e-98a3-767df413452b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423452 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfd9\" (UniqueName: 
\"kubernetes.io/projected/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-kube-api-access-qhfd9\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423486 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f59d\" (UniqueName: \"kubernetes.io/projected/5a502918-8f04-4208-b05c-d1d3fe1f7110-kube-api-access-4f59d\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423550 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/55cea12f-f7be-4dde-9f2a-906005391f52-signing-key\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423599 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96x75\" (UniqueName: \"kubernetes.io/projected/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-kube-api-access-96x75\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423631 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-stats-auth\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423663 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-certs\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423710 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/889b475b-a608-4aa3-ad0b-1824bd08b2a2-serving-cert\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423743 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-etcd-serving-ca\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423792 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dd5x\" (UniqueName: \"kubernetes.io/projected/9e5708d8-3116-4947-a74f-9551fbfdb501-kube-api-access-6dd5x\") pod \"multus-admission-controller-857f4d67dd-x8h4d\" (UID: \"9e5708d8-3116-4947-a74f-9551fbfdb501\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423825 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/137638ac-cb39-4dc4-b21a-93f89b9297b6-trusted-ca\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423857 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-config\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423891 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/55cea12f-f7be-4dde-9f2a-906005391f52-signing-cabundle\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423934 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-profile-collector-cert\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.423993 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-config-volume\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424067 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtlm4\" (UniqueName: \"kubernetes.io/projected/8b1b1162-9959-4680-b21e-078eb49dba97-kube-api-access-xtlm4\") pod \"ingress-canary-5fwjj\" (UID: \"8b1b1162-9959-4680-b21e-078eb49dba97\") " pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424147 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-serving-cert\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424199 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424232 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/b549d43d-f011-4c3b-9fd6-b3af936f56ed-audit-dir\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424275 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b04ea21-e24d-4d1c-861e-28746c304f7d-service-ca-bundle\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424310 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m674x\" (UniqueName: \"kubernetes.io/projected/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-kube-api-access-m674x\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424347 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a502918-8f04-4208-b05c-d1d3fe1f7110-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424386 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588hb\" (UniqueName: \"kubernetes.io/projected/137638ac-cb39-4dc4-b21a-93f89b9297b6-kube-api-access-588hb\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424442 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/137638ac-cb39-4dc4-b21a-93f89b9297b6-metrics-tls\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424474 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-config-volume\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424507 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g69z8\" (UniqueName: \"kubernetes.io/projected/0b04ea21-e24d-4d1c-861e-28746c304f7d-kube-api-access-g69z8\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424578 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbhd4\" (UniqueName: \"kubernetes.io/projected/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-kube-api-access-gbhd4\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: 
I1006 08:42:47.424632 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkgqq\" (UniqueName: \"kubernetes.io/projected/55cea12f-f7be-4dde-9f2a-906005391f52-kube-api-access-jkgqq\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424679 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b549d43d-f011-4c3b-9fd6-b3af936f56ed-node-pullsecrets\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424724 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13830283-fabe-488e-98a3-767df413452b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424845 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b1b1162-9959-4680-b21e-078eb49dba97-cert\") pod \"ingress-canary-5fwjj\" (UID: \"8b1b1162-9959-4680-b21e-078eb49dba97\") " pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424904 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137638ac-cb39-4dc4-b21a-93f89b9297b6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.424973 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-metrics-tls\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.425021 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a502918-8f04-4208-b05c-d1d3fe1f7110-proxy-tls\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.425103 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-etcd-client\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.425157 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-secret-volume\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.429091 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:47.929022478 +0000 UTC m=+99.644075906 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.429378 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-csi-data-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.430468 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-audit\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.432833 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-config-volume\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.434370 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-srv-cert\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.434591 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-tmpfs\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.435775 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.436789 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-image-import-ca\") pod \"apiserver-76f77b778f-kxqjk\" 
(UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.437501 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-ca\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.438558 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-etcd-serving-ca\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.439820 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13830283-fabe-488e-98a3-767df413452b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.441359 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb2950a4-c31b-47fd-bc69-84015e5e58c5-srv-cert\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.441802 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b549d43d-f011-4c3b-9fd6-b3af936f56ed-config\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.442192 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-config-volume\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.443201 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-stats-auth\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.443289 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/351aa4d4-e29f-4405-9985-5953396ca08e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hljsk\" (UID: \"351aa4d4-e29f-4405-9985-5953396ca08e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.443522 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/137638ac-cb39-4dc4-b21a-93f89b9297b6-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.443601 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-config\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.443693 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb2950a4-c31b-47fd-bc69-84015e5e58c5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.443996 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/55cea12f-f7be-4dde-9f2a-906005391f52-signing-cabundle\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.444017 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.444689 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-profile-collector-cert\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.444956 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ee7285a-6ddc-4b48-a89b-a6692ba95ce6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jxcr7\" (UID: \"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.445599 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.445649 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b549d43d-f011-4c3b-9fd6-b3af936f56ed-audit-dir\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.446288 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b04ea21-e24d-4d1c-861e-28746c304f7d-service-ca-bundle\") pod 
\"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.447102 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-config\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.447200 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/55cea12f-f7be-4dde-9f2a-906005391f52-signing-key\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.447331 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-registration-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.447457 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.447940 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b549d43d-f011-4c3b-9fd6-b3af936f56ed-node-pullsecrets\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.448015 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-socket-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.448210 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.451716 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-apiservice-cert\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.452001 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/889b475b-a608-4aa3-ad0b-1824bd08b2a2-serving-cert\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.452288 
4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-client\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.461055 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-serving-cert\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.461642 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-webhook-cert\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.461914 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-proxy-tls\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.461941 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/889b475b-a608-4aa3-ad0b-1824bd08b2a2-etcd-service-ca\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.462330 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86e3a40-53eb-466d-ad93-268de0cb9e0d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.462462 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-images\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.462723 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13830283-fabe-488e-98a3-767df413452b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.462854 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-metrics-certs\") pod \"router-default-5444994796-99z72\" (UID: 
\"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.462935 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-serving-cert\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.463118 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-secret-volume\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.463335 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-mountpoint-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.463109 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f8b408c0-509e-4ff0-9688-7d142ec0a14e-plugins-dir\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.464641 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a502918-8f04-4208-b05c-d1d3fe1f7110-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.464858 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9e5708d8-3116-4947-a74f-9551fbfdb501-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x8h4d\" (UID: \"9e5708d8-3116-4947-a74f-9551fbfdb501\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.465246 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e3a40-53eb-466d-ad93-268de0cb9e0d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.466201 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-encryption-config\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.467201 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-node-bootstrap-token\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.467459 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-certs\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.467738 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b549d43d-f011-4c3b-9fd6-b3af936f56ed-etcd-client\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.468827 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0b04ea21-e24d-4d1c-861e-28746c304f7d-default-certificate\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.469464 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5a502918-8f04-4208-b05c-d1d3fe1f7110-proxy-tls\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.469827 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/137638ac-cb39-4dc4-b21a-93f89b9297b6-metrics-tls\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.479020 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mkts\" (UniqueName: \"kubernetes.io/projected/f60e974a-cd62-411f-9838-0f03704829fc-kube-api-access-4mkts\") pod \"dns-operator-744455d44c-crmz6\" (UID: \"f60e974a-cd62-411f-9838-0f03704829fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.485182 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.489930 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b1b1162-9959-4680-b21e-078eb49dba97-cert\") pod \"ingress-canary-5fwjj\" (UID: \"8b1b1162-9959-4680-b21e-078eb49dba97\") " pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.493753 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-metrics-tls\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.504343 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvq8\" (UniqueName: \"kubernetes.io/projected/4b71d01f-9ec2-42f2-9271-822c00b5c142-kube-api-access-xsvq8\") pod \"openshift-apiserver-operator-796bbdcf4f-7vsnw\" (UID: \"4b71d01f-9ec2-42f2-9271-822c00b5c142\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.504444 4610 request.go:700] Waited for 1.928899545s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.522869 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzck\" (UniqueName: \"kubernetes.io/projected/e10a4f9e-fd83-4951-bd62-6e274077d37d-kube-api-access-phzck\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.526020 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.526692 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.026667299 +0000 UTC m=+99.741720877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.540586 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10a4f9e-fd83-4951-bd62-6e274077d37d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-npdc2\" (UID: \"e10a4f9e-fd83-4951-bd62-6e274077d37d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.573119 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p789h\" (UniqueName: \"kubernetes.io/projected/2e14ca24-0be1-4cbc-bbdb-68331334706b-kube-api-access-p789h\") pod \"cluster-samples-operator-665b6dd947-n88h5\" (UID: \"2e14ca24-0be1-4cbc-bbdb-68331334706b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.582183 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwc6p\" (UniqueName: \"kubernetes.io/projected/0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a-kube-api-access-qwc6p\") pod \"downloads-7954f5f757-klkbs\" (UID: \"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a\") " pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.596433 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/75726254-6806-4c39-a565-f48ca0eb4fd3-kube-api-access-jpwrq\") pod \"console-f9d7485db-8p28v\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") " pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.602828 4610 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.615219 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.623151 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.627193 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.627387 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.127358226 +0000 UTC m=+99.842411624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.627521 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.628660 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.128633532 +0000 UTC m=+99.843686970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.629359 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.643220 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.683329 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.699719 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xj96b" event={"ID":"b70fef81-f099-4afe-b277-2418e2cd3d8e","Type":"ContainerStarted","Data":"6d49c22e71a150f9695821c4cb86a023c9e54d802066993ff40a5b52dad73fc9"} Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.700622 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.701854 4610 patch_prober.go:28] interesting pod/console-operator-58897d9998-xj96b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.701888 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xj96b" podUID="b70fef81-f099-4afe-b277-2418e2cd3d8e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.705834 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.743937 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.748504 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.748833 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.749026 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.749316 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.249299698 +0000 UTC m=+99.964353086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.752336 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.758176 4610 generic.go:334] "Generic (PLEG): container finished" podID="3f6c789d-43eb-48d0-aad4-cd6eef7bc706" containerID="60ecd124d170a220801fdcc8a2fb348d3d5ae0deb28a53b4a72fe5bf7bbbc78f" exitCode=0 Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.758245 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" event={"ID":"3f6c789d-43eb-48d0-aad4-cd6eef7bc706","Type":"ContainerDied","Data":"60ecd124d170a220801fdcc8a2fb348d3d5ae0deb28a53b4a72fe5bf7bbbc78f"} Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.761613 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.762864 4610 generic.go:334] "Generic (PLEG): container finished" podID="e95710e9-1583-407e-9cee-377d17a9c70d" containerID="c915fe7a1b8d8e238a7e05a86dd416e9d9e8c6df9cc92e3ba81b8b0ce95ea94e" exitCode=0 Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.762910 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" event={"ID":"e95710e9-1583-407e-9cee-377d17a9c70d","Type":"ContainerDied","Data":"c915fe7a1b8d8e238a7e05a86dd416e9d9e8c6df9cc92e3ba81b8b0ce95ea94e"} Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.785500 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.789014 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.826639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vpf\" (UniqueName: \"kubernetes.io/projected/5e6695a0-e257-46a6-9459-7b476baa633b-kube-api-access-b6vpf\") pod \"route-controller-manager-6576b87f9c-2ktvk\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.850096 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v8c6\" (UniqueName: \"kubernetes.io/projected/e86ad86a-2294-4bd5-b065-f590c5d46c19-kube-api-access-6v8c6\") pod \"openshift-controller-manager-operator-756b6f6bc6-2pc2t\" (UID: \"e86ad86a-2294-4bd5-b065-f590c5d46c19\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.850746 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.852171 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.352154317 +0000 UTC m=+100.067207705 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.865912 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqh7w\" (UniqueName: \"kubernetes.io/projected/db79ee81-c008-4374-9523-e762c47c9668-kube-api-access-qqh7w\") pod \"machine-api-operator-5694c8668f-9g4kq\" (UID: \"db79ee81-c008-4374-9523-e762c47c9668\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.874909 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.894335 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwf6\" (UniqueName: \"kubernetes.io/projected/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-kube-api-access-hmwf6\") pod \"oauth-openshift-558db77b4-mgm5v\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.896501 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5"] Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.910911 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bt6z\" (UniqueName: \"kubernetes.io/projected/1293a8cf-7266-4bf1-bc49-b8369656484b-kube-api-access-8bt6z\") pod \"controller-manager-879f6c89f-7qsjg\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.939268 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.944602 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afc38731-aa1b-4009-a1f0-5877684d53f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jf7qm\" (UID: \"afc38731-aa1b-4009-a1f0-5877684d53f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.951519 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.951562 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-crmz6"] Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.951720 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.451703922 +0000 UTC m=+100.166757310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.951885 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:47 crc kubenswrapper[4610]: E1006 08:42:47.952608 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.452597087 +0000 UTC m=+100.167650475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.958405 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.963319 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2wl\" (UniqueName: \"kubernetes.io/projected/2ed7e624-4555-4dc2-85dc-cb3e305262dd-kube-api-access-rf2wl\") pod \"machine-approver-56656f9798-8sxnf\" (UID: \"2ed7e624-4555-4dc2-85dc-cb3e305262dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.971542 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" Oct 06 08:42:47 crc kubenswrapper[4610]: I1006 08:42:47.981916 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b1e8485-a57c-4aab-bf55-01fc322047cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j66hd\" (UID: \"6b1e8485-a57c-4aab-bf55-01fc322047cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.000277 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-bound-sa-token\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.006470 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dadf4701-13e0-4382-a658-2e2bd9d52ecb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cf2pn\" (UID: \"dadf4701-13e0-4382-a658-2e2bd9d52ecb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.020367 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.034369 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.035117 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.043538 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb88n\" (UniqueName: \"kubernetes.io/projected/4c4276ab-f0c0-483a-96b6-78e45d3a8a2e-kube-api-access-tb88n\") pod \"authentication-operator-69f744f599-mm5ft\" (UID: \"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.048520 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nkmc\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-kube-api-access-5nkmc\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.058228 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.058853 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-klkbs"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.058895 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.059369 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.559332387 +0000 UTC m=+100.274385775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.065929 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bx84\" (UniqueName: \"kubernetes.io/projected/b549d43d-f011-4c3b-9fd6-b3af936f56ed-kube-api-access-7bx84\") pod \"apiserver-76f77b778f-kxqjk\" (UID: \"b549d43d-f011-4c3b-9fd6-b3af936f56ed\") " pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.067582 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.095464 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whlw\" (UniqueName: \"kubernetes.io/projected/889b475b-a608-4aa3-ad0b-1824bd08b2a2-kube-api-access-5whlw\") pod \"etcd-operator-b45778765-k6pjc\" (UID: \"889b475b-a608-4aa3-ad0b-1824bd08b2a2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.103629 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvwx5\" (UniqueName: \"kubernetes.io/projected/5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c-kube-api-access-vvwx5\") pod \"machine-config-operator-74547568cd-7bcxq\" (UID: \"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.104167 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.114317 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.123953 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjtf\" (UniqueName: \"kubernetes.io/projected/f8b408c0-509e-4ff0-9688-7d142ec0a14e-kube-api-access-lfjtf\") pod \"csi-hostpathplugin-kjgjr\" (UID: \"f8b408c0-509e-4ff0-9688-7d142ec0a14e\") " pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.145175 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmt2t\" (UniqueName: \"kubernetes.io/projected/c7b26f53-77fe-4ee8-a966-f95ad3dcaae1-kube-api-access-hmt2t\") pod \"packageserver-d55dfcdfc-7z998\" (UID: \"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.151030 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.158356 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.160769 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.161245 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.661231698 +0000 UTC m=+100.376285086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.165375 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5kz\" (UniqueName: \"kubernetes.io/projected/13830283-fabe-488e-98a3-767df413452b-kube-api-access-ks5kz\") pod \"marketplace-operator-79b997595-t5gpw\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.182741 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x52n5\" (UniqueName: \"kubernetes.io/projected/c86e3a40-53eb-466d-ad93-268de0cb9e0d-kube-api-access-x52n5\") pod \"kube-storage-version-migrator-operator-b67b599dd-75clg\" (UID: \"c86e3a40-53eb-466d-ad93-268de0cb9e0d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.211217 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkr5v\" (UniqueName: \"kubernetes.io/projected/70fadbce-fb8b-4f7a-bbff-3036aba2a51e-kube-api-access-mkr5v\") pod \"migrator-59844c95c7-5xmkk\" (UID: \"70fadbce-fb8b-4f7a-bbff-3036aba2a51e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" Oct 06 08:42:48 crc kubenswrapper[4610]: W1006 08:42:48.215246 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed7e624_4555_4dc2_85dc_cb3e305262dd.slice/crio-8af8195b6488aaa24e2a18493d3555169226eb37cca8d3196be22166bbae03e2 WatchSource:0}: Error finding container 8af8195b6488aaa24e2a18493d3555169226eb37cca8d3196be22166bbae03e2: Status 404 returned error can't find the container with id 8af8195b6488aaa24e2a18493d3555169226eb37cca8d3196be22166bbae03e2 Oct 06 08:42:48 crc kubenswrapper[4610]: W1006 08:42:48.230237 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3e1eb2_9480_4350_8a62_c8c25f8dcc7a.slice/crio-6f7d323a317ac13f09c9a64326052bfabe88c86a35236ba7aff9851ea904e88a WatchSource:0}: Error finding container 6f7d323a317ac13f09c9a64326052bfabe88c86a35236ba7aff9851ea904e88a: Status 404 returned error can't find the container with id 6f7d323a317ac13f09c9a64326052bfabe88c86a35236ba7aff9851ea904e88a Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.244910 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfd9\" (UniqueName: \"kubernetes.io/projected/11417b4a-6ad4-44ea-9ba1-45a98d2fb619-kube-api-access-qhfd9\") pod \"catalog-operator-68c6474976-27r27\" (UID: \"11417b4a-6ad4-44ea-9ba1-45a98d2fb619\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.247448 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.264539 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.265349 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.765329072 +0000 UTC m=+100.480382460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.272799 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f59d\" (UniqueName: \"kubernetes.io/projected/5a502918-8f04-4208-b05c-d1d3fe1f7110-kube-api-access-4f59d\") pod \"machine-config-controller-84d6567774-rjrwh\" (UID: \"5a502918-8f04-4208-b05c-d1d3fe1f7110\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.274947 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dd5x\" (UniqueName: \"kubernetes.io/projected/9e5708d8-3116-4947-a74f-9551fbfdb501-kube-api-access-6dd5x\") pod \"multus-admission-controller-857f4d67dd-x8h4d\" (UID: \"9e5708d8-3116-4947-a74f-9551fbfdb501\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.279550 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.318879 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mkk\" (UniqueName: \"kubernetes.io/projected/351aa4d4-e29f-4405-9985-5953396ca08e-kube-api-access-52mkk\") pod \"control-plane-machine-set-operator-78cbb6b69f-hljsk\" (UID: \"351aa4d4-e29f-4405-9985-5953396ca08e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.327218 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc700e3e-8574-4d4d-bd15-ba8e0b21a51f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6926w\" (UID: \"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.342630 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fzl\" (UniqueName: \"kubernetes.io/projected/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-kube-api-access-b8fzl\") pod \"collect-profiles-29328990-22k59\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.342661 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96x75\" (UniqueName: \"kubernetes.io/projected/f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b-kube-api-access-96x75\") pod \"service-ca-operator-777779d784-8n2hw\" (UID: \"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.351154 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8p28v"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.351184 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.359739 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtlm4\" (UniqueName: \"kubernetes.io/projected/8b1b1162-9959-4680-b21e-078eb49dba97-kube-api-access-xtlm4\") pod \"ingress-canary-5fwjj\" (UID: \"8b1b1162-9959-4680-b21e-078eb49dba97\") " pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.366718 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.367111 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.86709883 +0000 UTC m=+100.582152218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.384971 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.398357 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.402125 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.405487 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m674x\" (UniqueName: \"kubernetes.io/projected/e4afea40-045a-4d72-92a9-f82cb6fe9cf3-kube-api-access-m674x\") pod \"dns-default-fjb8x\" (UID: \"e4afea40-045a-4d72-92a9-f82cb6fe9cf3\") " pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.406004 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5f2d\" (UniqueName: \"kubernetes.io/projected/fb2950a4-c31b-47fd-bc69-84015e5e58c5-kube-api-access-z5f2d\") pod \"olm-operator-6b444d44fb-m587k\" (UID: \"fb2950a4-c31b-47fd-bc69-84015e5e58c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.407191 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.411394 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.412076 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.421083 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.422829 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgch\" (UniqueName: \"kubernetes.io/projected/8ee7285a-6ddc-4b48-a89b-a6692ba95ce6-kube-api-access-zrgch\") pod \"package-server-manager-789f6589d5-jxcr7\" (UID: \"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.423580 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.433702 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.441784 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.442241 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.456394 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69z8\" (UniqueName: \"kubernetes.io/projected/0b04ea21-e24d-4d1c-861e-28746c304f7d-kube-api-access-g69z8\") pod \"router-default-5444994796-99z72\" (UID: \"0b04ea21-e24d-4d1c-861e-28746c304f7d\") " pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.467833 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.467964 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.468518 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.968493038 +0000 UTC m=+100.683546436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.470492 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.471816 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:48.971799762 +0000 UTC m=+100.686853150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.476609 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbhd4\" (UniqueName: \"kubernetes.io/projected/2cfc7a17-4e42-4b20-b802-ed50cc19d89d-kube-api-access-gbhd4\") pod \"machine-config-server-s5jn2\" (UID: \"2cfc7a17-4e42-4b20-b802-ed50cc19d89d\") " pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.478561 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.478861 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkgqq\" (UniqueName: \"kubernetes.io/projected/55cea12f-f7be-4dde-9f2a-906005391f52-kube-api-access-jkgqq\") pod \"service-ca-9c57cc56f-qrfc4\" (UID: \"55cea12f-f7be-4dde-9f2a-906005391f52\") " pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.487936 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.497127 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.505072 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.512856 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5fwjj" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.518241 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588hb\" (UniqueName: \"kubernetes.io/projected/137638ac-cb39-4dc4-b21a-93f89b9297b6-kube-api-access-588hb\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.526613 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.529381 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-s5jn2" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.558494 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.558881 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137638ac-cb39-4dc4-b21a-93f89b9297b6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f7qtd\" (UID: \"137638ac-cb39-4dc4-b21a-93f89b9297b6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.572156 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.572528 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.072500089 +0000 UTC m=+100.787553477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.657810 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.665294 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.675616 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.675982 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.175970686 +0000 UTC m=+100.891024074 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.676796 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.689747 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mgm5v"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.707489 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.727822 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9g4kq"] Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.776488 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.776914 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.27689924 +0000 UTC m=+100.991952628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.778314 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" event={"ID":"3f6c789d-43eb-48d0-aad4-cd6eef7bc706","Type":"ContainerStarted","Data":"175a31238ae0ffce897e11cc562b666c0a32bcf872e51074846df393d9ef97a6"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.782320 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" event={"ID":"2e14ca24-0be1-4cbc-bbdb-68331334706b","Type":"ContainerStarted","Data":"0ab5a24c2d4250c62e802513d403589eb75de4c00337069043001e4d5ee28466"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.783321 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8p28v" event={"ID":"75726254-6806-4c39-a565-f48ca0eb4fd3","Type":"ContainerStarted","Data":"39906fcb5c32e7d40b57b13cefc2c99d076496deb67a168b541e6760db06a235"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.784434 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-klkbs" event={"ID":"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a","Type":"ContainerStarted","Data":"6f7d323a317ac13f09c9a64326052bfabe88c86a35236ba7aff9851ea904e88a"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.790865 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" event={"ID":"f60e974a-cd62-411f-9838-0f03704829fc","Type":"ContainerStarted","Data":"859ea575e7c473edcc3ec29309b82a4caf85d282712fe43df724c55e2aeda093"} Oct 06 08:42:48 
crc kubenswrapper[4610]: I1006 08:42:48.803021 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" event={"ID":"2ed7e624-4555-4dc2-85dc-cb3e305262dd","Type":"ContainerStarted","Data":"8af8195b6488aaa24e2a18493d3555169226eb37cca8d3196be22166bbae03e2"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.813923 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" event={"ID":"e10a4f9e-fd83-4951-bd62-6e274077d37d","Type":"ContainerStarted","Data":"45297aed3e445e75cda3d84f82a9a216321ba9d44bd1fd0099f23b0ddc5f9537"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.815162 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" event={"ID":"4b71d01f-9ec2-42f2-9271-822c00b5c142","Type":"ContainerStarted","Data":"9241e0a910f4f788e325be0111f022325a52e946d34aab5c53281aad930ba4aa"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.815912 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" event={"ID":"afc38731-aa1b-4009-a1f0-5877684d53f6","Type":"ContainerStarted","Data":"cdc823ee9ade14a29c3eb8b46549bd9ef36c99ee486954befabda955be339684"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.817995 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" event={"ID":"e95710e9-1583-407e-9cee-377d17a9c70d","Type":"ContainerStarted","Data":"77eba273aa494734466bb7999c352c12d3029a5281b5e4cacff86370aca7fae7"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.818187 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.820495 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" event={"ID":"e86ad86a-2294-4bd5-b065-f590c5d46c19","Type":"ContainerStarted","Data":"4fb884561b33e4ba92d33238e2ec67339bf0b51804fd7ca925bdc89e173fd69e"} Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.820594 4610 patch_prober.go:28] interesting pod/console-operator-58897d9998-xj96b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.820634 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xj96b" podUID="b70fef81-f099-4afe-b277-2418e2cd3d8e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.878742 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.879814 
4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.37979091 +0000 UTC m=+101.094844298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.989150 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:48 crc kubenswrapper[4610]: E1006 08:42:48.989916 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.489900525 +0000 UTC m=+101.204953913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:48 crc kubenswrapper[4610]: I1006 08:42:48.998202 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kjgjr"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.018493 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.103734 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.103980 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.603969233 +0000 UTC m=+101.319022621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.110269 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.110298 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.130396 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mm5ft"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.132185 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kxqjk"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.209149 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.209670 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.709650093 +0000 UTC m=+101.424703481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.310654 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.310915 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.810903956 +0000 UTC m=+101.525957344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.330122 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.415944 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.416092 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.916076931 +0000 UTC m=+101.631130309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.416556 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.416785 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.916777601 +0000 UTC m=+101.631830989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.525807 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.526237 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.026221137 +0000 UTC m=+101.741274525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.608842 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.629392 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.629673 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.129661573 +0000 UTC m=+101.844714961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.732605 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.733072 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.232978185 +0000 UTC m=+101.948031573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.755837 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x8h4d"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.798639 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qrfc4"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.834647 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.834987 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.334976228 +0000 UTC m=+102.050029616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.850814 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" event={"ID":"b549d43d-f011-4c3b-9fd6-b3af936f56ed","Type":"ContainerStarted","Data":"422639473eeec82fea168a04d8f59fa6aa700bd737acc24ded3333aee22f7618"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.856910 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.862185 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" event={"ID":"dadf4701-13e0-4382-a658-2e2bd9d52ecb","Type":"ContainerStarted","Data":"5555b60ecbe34ff4797ea501579003d2317834492f1f14b2fa4c25dd2ed3ba9e"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.871888 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" event={"ID":"f8b408c0-509e-4ff0-9688-7d142ec0a14e","Type":"ContainerStarted","Data":"4bb40074c053ea855a47b80ab0d066600435293355d7220f5fc4c0be77f16f08"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.874127 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" event={"ID":"1293a8cf-7266-4bf1-bc49-b8369656484b","Type":"ContainerStarted","Data":"732c42b4e15b2538e79b4c43c43939875308f87fe29d413c9b682580b89c9f1c"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.882963 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" event={"ID":"afc38731-aa1b-4009-a1f0-5877684d53f6","Type":"ContainerStarted","Data":"83aa37e9e571f5666c9af6221d08ba83382093cf37e67b405a1efd1de4ccd4a3"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.884736 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" event={"ID":"db79ee81-c008-4374-9523-e762c47c9668","Type":"ContainerStarted","Data":"83ffb7783506d30d20ac639349dd4380617c351a1a47bef42ddc55039f27d83d"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.885963 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" event={"ID":"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0","Type":"ContainerStarted","Data":"a3d927d7b8273b4024b61f708eb5a8fcd9830c9bcb39603c532bc5efe716dfd4"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.887418 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" event={"ID":"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c","Type":"ContainerStarted","Data":"edc66e67c982a39f02fef028dc43eff87d06d100a9c2c4272e15b17e03cf60fd"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.894262 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" event={"ID":"6b1e8485-a57c-4aab-bf55-01fc322047cf","Type":"ContainerStarted","Data":"095d2c26ed2ae79338db923dd4a11f9d0d7c619a9e1c8bfb8a3beed579a5f561"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.897777 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-99z72" event={"ID":"0b04ea21-e24d-4d1c-861e-28746c304f7d","Type":"ContainerStarted","Data":"26f460adf851791e3aadb9f3a173ac1affd43e84874e0d606157aab37d21add7"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.906104 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.906183 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" event={"ID":"f60e974a-cd62-411f-9838-0f03704829fc","Type":"ContainerStarted","Data":"9cbab9b7df3edf97b5d4f55f48251208cde6d382a7d584641e28855e00d4d13f"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.906223 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k6pjc"] Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.906237 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" event={"ID":"2ed7e624-4555-4dc2-85dc-cb3e305262dd","Type":"ContainerStarted","Data":"d99947c75c8edd11edf1d5b84ce9caf019a8744ff8e2c3e878c10988ce0fc460"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.913776 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" event={"ID":"fb2950a4-c31b-47fd-bc69-84015e5e58c5","Type":"ContainerStarted","Data":"210fbdc5fc529299f4d4b63eba8213c0b565e710ee2b829d8c235184b3565569"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.926108 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" event={"ID":"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e","Type":"ContainerStarted","Data":"4b5105279b0284810a12e273d1fc98895039da45e77f5201a2e2219ed779d78c"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.928982 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" event={"ID":"5e6695a0-e257-46a6-9459-7b476baa633b","Type":"ContainerStarted","Data":"8e9dbfa2fa4b9d97c89acdb812ae405be06dacb0d8a855d100e3126d77a4e2a8"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.932754 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" event={"ID":"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1","Type":"ContainerStarted","Data":"0ab2304a3f436b6af5a0a1cafde6f8dbc0580a8810621a9caee6db66c96abd85"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.935706 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.935846 4610 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.43583095 +0000 UTC m=+102.150884338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.936117 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:49 crc kubenswrapper[4610]: E1006 08:42:49.936498 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.436483459 +0000 UTC m=+102.151537007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.937793 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" event={"ID":"4b71d01f-9ec2-42f2-9271-822c00b5c142","Type":"ContainerStarted","Data":"db8ee9043ddbaaa43d33f4b1423e2d90b43e389939d76d7140650f7dd2880dbc"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.940236 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-klkbs" event={"ID":"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a","Type":"ContainerStarted","Data":"f93a61903bd2ad8a310857d778c3a9bc0e8120961c34c9f70d3810f07b4874fa"} Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.941389 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.941724 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.941754 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" 
Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.942244 4610 patch_prober.go:28] interesting pod/console-operator-58897d9998-xj96b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.942319 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xj96b" podUID="b70fef81-f099-4afe-b277-2418e2cd3d8e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 06 08:42:49 crc kubenswrapper[4610]: I1006 08:42:49.991950 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.025807 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.049923 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.051683 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.551472163 +0000 UTC m=+102.266525551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: W1006 08:42:50.057213 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod137638ac_cb39_4dc4_b21a_93f89b9297b6.slice/crio-6e5a667b511b12d9abc8ad6eb72ff4f10985d5feac44a5b4e063146acf94454f WatchSource:0}: Error finding container 6e5a667b511b12d9abc8ad6eb72ff4f10985d5feac44a5b4e063146acf94454f: Status 404 returned error can't find the container with id 6e5a667b511b12d9abc8ad6eb72ff4f10985d5feac44a5b4e063146acf94454f Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.059393 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.059973 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.559959485 +0000 UTC m=+102.275012873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.075066 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.079261 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.084369 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t5gpw"] Oct 06 08:42:50 crc kubenswrapper[4610]: W1006 08:42:50.107660 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a502918_8f04_4208_b05c_d1d3fe1f7110.slice/crio-cd41ae052808c86e344732caed2b0424e9d8a2e64af2bb311fe7c5da4de5687a WatchSource:0}: Error finding container cd41ae052808c86e344732caed2b0424e9d8a2e64af2bb311fe7c5da4de5687a: Status 404 returned error can't find the container with id cd41ae052808c86e344732caed2b0424e9d8a2e64af2bb311fe7c5da4de5687a Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.131874 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5fwjj"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.167814 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.167966 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.66793579 +0000 UTC m=+102.382989188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.168177 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.168561 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.668545147 +0000 UTC m=+102.383598535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.198489 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" podStartSLOduration=76.198465799 podStartE2EDuration="1m16.198465799s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:50.192759136 +0000 UTC m=+101.907812524" watchObservedRunningTime="2025-10-06 08:42:50.198465799 +0000 UTC m=+101.913519187" Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.254093 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.254145 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.255136 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-klkbs" podStartSLOduration=76.255126192 podStartE2EDuration="1m16.255126192s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:50.246244649 +0000 UTC m=+101.961298047" watchObservedRunningTime="2025-10-06 08:42:50.255126192 +0000 UTC m=+101.970179580" Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.257231 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fjb8x"] Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.268752 4610 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.269210 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.769193123 +0000 UTC m=+102.484246511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.294498 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" podStartSLOduration=76.294484133 podStartE2EDuration="1m16.294484133s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:50.287380701 +0000 UTC m=+102.002434089" watchObservedRunningTime="2025-10-06 08:42:50.294484133 +0000 UTC m=+102.009537521" Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.296773 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w"] Oct 06 08:42:50 crc kubenswrapper[4610]: W1006 08:42:50.299510 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b1b1162_9959_4680_b21e_078eb49dba97.slice/crio-727524b38f1caf7a0117c54b2ae50fad8cc677a38bb5ac0d6cea688109203632 WatchSource:0}: Error finding container 727524b38f1caf7a0117c54b2ae50fad8cc677a38bb5ac0d6cea688109203632: Status 404 returned error can't find the container with id 727524b38f1caf7a0117c54b2ae50fad8cc677a38bb5ac0d6cea688109203632 Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.314839 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xj96b" podStartSLOduration=76.314797602 podStartE2EDuration="1m16.314797602s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:50.314001669 +0000 UTC m=+102.029055087" watchObservedRunningTime="2025-10-06 08:42:50.314797602 +0000 UTC m=+102.029850990" Oct 06 08:42:50 crc kubenswrapper[4610]: W1006 08:42:50.369248 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4afea40_045a_4d72_92a9_f82cb6fe9cf3.slice/crio-cf69647ac33d60fbe0ea28fceb35778e393d51de38d77428efe1a60971e83d40 WatchSource:0}: Error finding container cf69647ac33d60fbe0ea28fceb35778e393d51de38d77428efe1a60971e83d40: Status 404 returned error can't find 
the container with id cf69647ac33d60fbe0ea28fceb35778e393d51de38d77428efe1a60971e83d40 Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.370136 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.370929 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.87091395 +0000 UTC m=+102.585967338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.472208 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.472629 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.972612806 +0000 UTC m=+102.687666194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.573722 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.574231 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.074211749 +0000 UTC m=+102.789265287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.674507 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.674638 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.174611878 +0000 UTC m=+102.889665296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.675486 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.675825 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.175813832 +0000 UTC m=+102.890867220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.776611 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.776844 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.276819388 +0000 UTC m=+102.991872776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.777032 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.777418 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.277406635 +0000 UTC m=+102.992460023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.878755 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.878921 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.378897265 +0000 UTC m=+103.093950653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.879130 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.879460 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.37944715 +0000 UTC m=+103.094500538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.963399 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" event={"ID":"c86e3a40-53eb-466d-ad93-268de0cb9e0d","Type":"ContainerStarted","Data":"2da18a09bb9f5e1f2d56624a55494653cab40ac243db110ed57f7ce73b7db7e3"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.964937 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" event={"ID":"9e5708d8-3116-4947-a74f-9551fbfdb501","Type":"ContainerStarted","Data":"e1cb76b16c639e9b962a39bd81a0cfc263cd7829647aa976a8162ce607983399"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.967171 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" event={"ID":"db79ee81-c008-4374-9523-e762c47c9668","Type":"ContainerStarted","Data":"26e8fe6963b7d774ef93634d7aea318dcaf4f3e1d3785f8b001350135e6d527e"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.973924 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" event={"ID":"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c","Type":"ContainerStarted","Data":"471a2f0833f231410609e2c7c9d711ab763019e88b8ccb4e6783cb84c9a7697c"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.975667 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" event={"ID":"1293a8cf-7266-4bf1-bc49-b8369656484b","Type":"ContainerStarted","Data":"033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.976820 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" event={"ID":"55cea12f-f7be-4dde-9f2a-906005391f52","Type":"ContainerStarted","Data":"880face23a81a864d87f388151a5e15965fcb30f169a995a95db709c5f99a46b"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.978213 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" event={"ID":"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6","Type":"ContainerStarted","Data":"79688fbf99f55678158a696aa3e543abf7a86b3c12e5eeb66acfc6dbb34f1010"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.979701 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:50 crc kubenswrapper[4610]: E1006 08:42:50.980466 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-06 08:42:51.480452817 +0000 UTC m=+103.195506205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.982207 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" event={"ID":"6f2291f3-fb1c-4d23-9f78-59ef302b5c02","Type":"ContainerStarted","Data":"e85ec4582f8c3a02caff38aff87ef5cc3f1013bdeddb7f3241e284cdfc8c8e74"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.983172 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8p28v" event={"ID":"75726254-6806-4c39-a565-f48ca0eb4fd3","Type":"ContainerStarted","Data":"f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.984300 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" event={"ID":"351aa4d4-e29f-4405-9985-5953396ca08e","Type":"ContainerStarted","Data":"7abdaf86814dd501a9dedeaef5d9a6b30cb1dd6019bc14ecaab0477db8af94a1"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.985169 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" event={"ID":"70fadbce-fb8b-4f7a-bbff-3036aba2a51e","Type":"ContainerStarted","Data":"abf53580e8df0a246d7665979de125abb4d702bf368085b2f875a1ed07364b7d"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.986950 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" event={"ID":"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0","Type":"ContainerStarted","Data":"ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a"} Oct 06 08:42:50 crc kubenswrapper[4610]: I1006 08:42:50.999355 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" event={"ID":"2e14ca24-0be1-4cbc-bbdb-68331334706b","Type":"ContainerStarted","Data":"f68ddd464a5169d204f56d1b54deac1f5010365ee07fcfe41ef9baba449681fb"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.002346 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" event={"ID":"11417b4a-6ad4-44ea-9ba1-45a98d2fb619","Type":"ContainerStarted","Data":"225208d5ed42d7c4be4e5a153745f010644fe8b5fa1f375c837bcea02b931767"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.004984 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" event={"ID":"2ed7e624-4555-4dc2-85dc-cb3e305262dd","Type":"ContainerStarted","Data":"ddece357b2f9a3566f629cf68dae56f5fe69b74c5a9629f12e9117955afc7057"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.013555 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fjb8x" 
event={"ID":"e4afea40-045a-4d72-92a9-f82cb6fe9cf3","Type":"ContainerStarted","Data":"cf69647ac33d60fbe0ea28fceb35778e393d51de38d77428efe1a60971e83d40"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.015263 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" event={"ID":"e86ad86a-2294-4bd5-b065-f590c5d46c19","Type":"ContainerStarted","Data":"a97fd285c1f43438e081ac89414fc2c636ef4d3f5e3f525f56ac6119f35b50db"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.019361 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-s5jn2" event={"ID":"2cfc7a17-4e42-4b20-b802-ed50cc19d89d","Type":"ContainerStarted","Data":"821cfbe93f594febdd52b70a0259e7e245e33c5902c6615425446e5c78327a21"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.020179 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" event={"ID":"889b475b-a608-4aa3-ad0b-1824bd08b2a2","Type":"ContainerStarted","Data":"895101b326b811f8094e99d52dceeb2260e551c34d90f798292fe427661baab7"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.043252 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" event={"ID":"e10a4f9e-fd83-4951-bd62-6e274077d37d","Type":"ContainerStarted","Data":"3378e8b9de163f024871965cc493d2b3ca2e32784d6bc2e596b5a683f4d59554"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.057609 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" event={"ID":"5a502918-8f04-4208-b05c-d1d3fe1f7110","Type":"ContainerStarted","Data":"cd41ae052808c86e344732caed2b0424e9d8a2e64af2bb311fe7c5da4de5687a"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.069126 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5fwjj" event={"ID":"8b1b1162-9959-4680-b21e-078eb49dba97","Type":"ContainerStarted","Data":"727524b38f1caf7a0117c54b2ae50fad8cc677a38bb5ac0d6cea688109203632"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.070591 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" event={"ID":"137638ac-cb39-4dc4-b21a-93f89b9297b6","Type":"ContainerStarted","Data":"6e5a667b511b12d9abc8ad6eb72ff4f10985d5feac44a5b4e063146acf94454f"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.080977 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8p28v" podStartSLOduration=77.080947068 podStartE2EDuration="1m17.080947068s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:51.075883594 +0000 UTC m=+102.790937002" watchObservedRunningTime="2025-10-06 08:42:51.080947068 +0000 UTC m=+102.796000486" Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.083806 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.085412 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.585401015 +0000 UTC m=+103.300454393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.103027 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" event={"ID":"c7b26f53-77fe-4ee8-a966-f95ad3dcaae1","Type":"ContainerStarted","Data":"26eabe3056e994951550d0f7bdf9446027ef6e83e809646924c91e175ef2703c"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.107852 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" event={"ID":"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b","Type":"ContainerStarted","Data":"33256de0479f4cbd1e2091fad3fefdc49341caf4efbacf3d15e952b711757050"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.135238 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" event={"ID":"13830283-fabe-488e-98a3-767df413452b","Type":"ContainerStarted","Data":"87bfa8b66f604442bc0a3eddcbec185c72d8cb61dc078e89113f612a5fd9a9a7"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.138513 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" event={"ID":"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f","Type":"ContainerStarted","Data":"4fe2179a93308ed6c2503476766af4d64997a171b2f6ee910904b52a43404c79"} Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.151155 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.151210 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.194298 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7vsnw" podStartSLOduration=77.194273185 podStartE2EDuration="1m17.194273185s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:51.171359503 +0000 UTC m=+102.886412891" watchObservedRunningTime="2025-10-06 08:42:51.194273185 +0000 UTC 
m=+102.909326593" Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.194688 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.199201 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.699169885 +0000 UTC m=+103.414223283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.230944 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.232806 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.233614 4610 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-l4qt7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.233686 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" podUID="3f6c789d-43eb-48d0-aad4-cd6eef7bc706" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.301977 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.303840 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.803828925 +0000 UTC m=+103.518882313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.405877 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.406778 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:51.906752156 +0000 UTC m=+103.621805584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.507197 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.507479 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.007467824 +0000 UTC m=+103.722521212 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.608004 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.608244 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.108213332 +0000 UTC m=+103.823266730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.608796 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.609195 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.10917824 +0000 UTC m=+103.824231628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.709298 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.709631 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.20961433 +0000 UTC m=+103.924667728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.810431 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.810796 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.310780121 +0000 UTC m=+104.025833509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:51 crc kubenswrapper[4610]: I1006 08:42:51.912388 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:51 crc kubenswrapper[4610]: E1006 08:42:51.912742 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.412723614 +0000 UTC m=+104.127777002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.014271 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.014633 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.514617915 +0000 UTC m=+104.229671303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.115783 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.116113 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.616097825 +0000 UTC m=+104.331151203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.144556 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" event={"ID":"dadf4701-13e0-4382-a658-2e2bd9d52ecb","Type":"ContainerStarted","Data":"c27f9fc21b043751c667e69639b08a2751957643704541ab36be006f594b7f4f"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.145797 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" event={"ID":"4c4276ab-f0c0-483a-96b6-78e45d3a8a2e","Type":"ContainerStarted","Data":"6ee3d6b5ba22748aad6829bfe715322ae4ead9ec05d37477466f2fe167ee4ae8"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.147094 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" event={"ID":"f60e974a-cd62-411f-9838-0f03704829fc","Type":"ContainerStarted","Data":"9085a7711dd9c00bfae2b50dc12d61b2856a088120171e63aa197a2ee038359b"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.149331 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" event={"ID":"6f2291f3-fb1c-4d23-9f78-59ef302b5c02","Type":"ContainerStarted","Data":"dda44c10233a7de7bbe4e18440e175fafc2a7afda4e162068e939d1e77031fa0"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.150648 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" event={"ID":"f7ab2fa4-d62b-4d2a-8425-2e66578fdf9b","Type":"ContainerStarted","Data":"43550118adb32eaa2e6b5c136efbbee289d74602c27b0e52a4c4bf2475517457"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.154005 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" event={"ID":"9e5708d8-3116-4947-a74f-9551fbfdb501","Type":"ContainerStarted","Data":"86ab066b15b860265b6f53576eef7f7ec9d3333c1b9e779042b46d75cbfe4e41"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.156484 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" event={"ID":"137638ac-cb39-4dc4-b21a-93f89b9297b6","Type":"ContainerStarted","Data":"a373eb3417e5d094780982acec45f755027d9392794186e2f694009713ea6bd8"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.160055 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" event={"ID":"11417b4a-6ad4-44ea-9ba1-45a98d2fb619","Type":"ContainerStarted","Data":"97960569ed00d5873f318d1b58a6813c9290ff9cb7e46a26bc7732e0e79ce991"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.162096 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" event={"ID":"55cea12f-f7be-4dde-9f2a-906005391f52","Type":"ContainerStarted","Data":"2c1d8b7dc66898abb82c7d65481ce690c1dd65e843b473fb744cf425d2710ed5"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.168854 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jf7qm" podStartSLOduration=78.168837207 podStartE2EDuration="1m18.168837207s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:51.193359659 +0000 UTC m=+102.908413047" watchObservedRunningTime="2025-10-06 08:42:52.168837207 +0000 UTC m=+103.883890595"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.174723 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" event={"ID":"db79ee81-c008-4374-9523-e762c47c9668","Type":"ContainerStarted","Data":"6bb996a0a33b11fb4a6c034d9d0bf716b2e2c9a9f2af0928c24c375d2199bcd4"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.187856 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf2pn" podStartSLOduration=78.187837098 podStartE2EDuration="1m18.187837098s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.166701356 +0000 UTC m=+103.881754744" watchObservedRunningTime="2025-10-06 08:42:52.187837098 +0000 UTC m=+103.902890496"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.196769 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-s5jn2" event={"ID":"2cfc7a17-4e42-4b20-b802-ed50cc19d89d","Type":"ContainerStarted","Data":"1e44f497b15bd088ea27efdec7bd6d17fff3909a8e961c21b03a6dcecfa2e010"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.210109 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" event={"ID":"c86e3a40-53eb-466d-ad93-268de0cb9e0d","Type":"ContainerStarted","Data":"a233f992467c3543041b0bb8e8c18c77ee87834120d72fe8be980c33d2ca61bd"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.212008 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" event={"ID":"5a502918-8f04-4208-b05c-d1d3fe1f7110","Type":"ContainerStarted","Data":"ab93b1f9b69a98c0840dfa07c114e2a2565194035bda5efe06dd702ea6165116"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.212770 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8n2hw" podStartSLOduration=78.212757757 podStartE2EDuration="1m18.212757757s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.188671421 +0000 UTC m=+103.903724809" watchObservedRunningTime="2025-10-06 08:42:52.212757757 +0000 UTC m=+103.927811145"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.214999 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-crmz6" podStartSLOduration=78.214988591 podStartE2EDuration="1m18.214988591s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.212904792 +0000 UTC m=+103.927958180" watchObservedRunningTime="2025-10-06 08:42:52.214988591 +0000 UTC m=+103.930041999"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.217296 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.221151 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.721138546 +0000 UTC m=+104.436191934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.229262 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5fwjj" event={"ID":"8b1b1162-9959-4680-b21e-078eb49dba97","Type":"ContainerStarted","Data":"a2b55283ae6d18703d3a3e15ae44c402e139955b31ee74d8f1550f8f01d3614d"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.237348 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" event={"ID":"70fadbce-fb8b-4f7a-bbff-3036aba2a51e","Type":"ContainerStarted","Data":"5a4a9914e5ca972762e956976844a73f6f344b5e42174ae0fca4125685acc792"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.245630 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-s5jn2" podStartSLOduration=7.245613773 podStartE2EDuration="7.245613773s" podCreationTimestamp="2025-10-06 08:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.244309146 +0000 UTC m=+103.959362554" watchObservedRunningTime="2025-10-06 08:42:52.245613773 +0000 UTC m=+103.960667161"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.248155 4610 generic.go:334] "Generic (PLEG): container finished" podID="b549d43d-f011-4c3b-9fd6-b3af936f56ed" containerID="cb8f7dadacd48637ec5bbec43a083a195492c8310835bdccf9f2e3fac6af21b9" exitCode=0
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.248238 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" event={"ID":"b549d43d-f011-4c3b-9fd6-b3af936f56ed","Type":"ContainerDied","Data":"cb8f7dadacd48637ec5bbec43a083a195492c8310835bdccf9f2e3fac6af21b9"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.249177 4610 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-lzjcc container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.249211 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" podUID="e95710e9-1583-407e-9cee-377d17a9c70d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.249208 4610 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-lzjcc container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.249249 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" podUID="e95710e9-1583-407e-9cee-377d17a9c70d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.259540 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" event={"ID":"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6","Type":"ContainerStarted","Data":"0ba91a23914248e44114e6b58bc41a77a97b4731ae43112551643a452f3ae731"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.260955 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" event={"ID":"5e6695a0-e257-46a6-9459-7b476baa633b","Type":"ContainerStarted","Data":"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.261915 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.269188 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5fwjj" podStartSLOduration=7.269161763 podStartE2EDuration="7.269161763s" podCreationTimestamp="2025-10-06 08:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.266893959 +0000 UTC m=+103.981947357" watchObservedRunningTime="2025-10-06 08:42:52.269161763 +0000 UTC m=+103.984215151"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.273552 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" event={"ID":"13830283-fabe-488e-98a3-767df413452b","Type":"ContainerStarted","Data":"6cccc804c136975cd759ff8806d395cd73e72a1bda7c899ae6210fad4355bfed"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.275450 4610 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2ktvk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.275498 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.278017 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" event={"ID":"cc700e3e-8574-4d4d-bd15-ba8e0b21a51f","Type":"ContainerStarted","Data":"290cda45a3b73b49ea78e723c1f96380d59c35335c63b04623e0c0347b16ec61"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.280516 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" event={"ID":"6b1e8485-a57c-4aab-bf55-01fc322047cf","Type":"ContainerStarted","Data":"515ecd4ebb9c3a30db691c07e6f9de42ab34b844157680fd3b5ba5b0f80a36cc"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.290510 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" event={"ID":"889b475b-a608-4aa3-ad0b-1824bd08b2a2","Type":"ContainerStarted","Data":"8614fa09671b988ca3b3429048555e186a30fc45f0240ecc54b2c50e53d9ca5f"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.294563 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-99z72" event={"ID":"0b04ea21-e24d-4d1c-861e-28746c304f7d","Type":"ContainerStarted","Data":"f1e369c570560364f703165341b81376836ef0c122be52052d82b5197038209a"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.296868 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" event={"ID":"fb2950a4-c31b-47fd-bc69-84015e5e58c5","Type":"ContainerStarted","Data":"73098e0d45ffd8145a236f324ccaaf8d74cda3b1fa0b570630b8c1143e7d51f2"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.297982 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.301480 4610 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m587k container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.301547 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" podUID="fb2950a4-c31b-47fd-bc69-84015e5e58c5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.309193 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" event={"ID":"2e14ca24-0be1-4cbc-bbdb-68331334706b","Type":"ContainerStarted","Data":"bab9c02f5d8965d5413ae8f24a573ef73fc8ab1b23d8c10112994f01a856515d"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.311326 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fjb8x" event={"ID":"e4afea40-045a-4d72-92a9-f82cb6fe9cf3","Type":"ContainerStarted","Data":"3338b93b24cb3fbb9aa4e225d6814027a2a9207bf514c87c81e062d84595d5b0"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.315222 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" event={"ID":"351aa4d4-e29f-4405-9985-5953396ca08e","Type":"ContainerStarted","Data":"9e142267c5637416a926e992a79727a9f44899e4e3abbe969cb4dccd143b0fdb"}
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.315707 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.315758 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.320482 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.323807 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.823779149 +0000 UTC m=+104.538832537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.364970 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" podStartSLOduration=78.364944291 podStartE2EDuration="1m18.364944291s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.333010582 +0000 UTC m=+104.048063990" watchObservedRunningTime="2025-10-06 08:42:52.364944291 +0000 UTC m=+104.079997679"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.410500 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" podStartSLOduration=78.410461927 podStartE2EDuration="1m18.410461927s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.360973288 +0000 UTC m=+104.076026676" watchObservedRunningTime="2025-10-06 08:42:52.410461927 +0000 UTC m=+104.125515315"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.410876 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2pc2t" podStartSLOduration=78.410869329 podStartE2EDuration="1m18.410869329s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.409681175 +0000 UTC m=+104.124734583" watchObservedRunningTime="2025-10-06 08:42:52.410869329 +0000 UTC m=+104.125922717"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.437285 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.440697 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:52.940682258 +0000 UTC m=+104.655735646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.482362 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" podStartSLOduration=78.482346074 podStartE2EDuration="1m18.482346074s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.443243731 +0000 UTC m=+104.158297119" watchObservedRunningTime="2025-10-06 08:42:52.482346074 +0000 UTC m=+104.197399462"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.483082 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" podStartSLOduration=78.483076395 podStartE2EDuration="1m18.483076395s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.481350736 +0000 UTC m=+104.196404124" watchObservedRunningTime="2025-10-06 08:42:52.483076395 +0000 UTC m=+104.198129793"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.540775 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.541146 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.041124708 +0000 UTC m=+104.756178096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.543076 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sxnf" podStartSLOduration=78.543062183 podStartE2EDuration="1m18.543062183s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.5426145 +0000 UTC m=+104.257667888" watchObservedRunningTime="2025-10-06 08:42:52.543062183 +0000 UTC m=+104.258115571"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.544964 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j66hd" podStartSLOduration=78.544957757 podStartE2EDuration="1m18.544957757s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.512121402 +0000 UTC m=+104.227174790" watchObservedRunningTime="2025-10-06 08:42:52.544957757 +0000 UTC m=+104.260011135"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.592219 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-npdc2" podStartSLOduration=78.592190252 podStartE2EDuration="1m18.592190252s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.591679717 +0000 UTC m=+104.306733105" watchObservedRunningTime="2025-10-06 08:42:52.592190252 +0000 UTC m=+104.307243640"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.596282 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" podStartSLOduration=78.596274898 podStartE2EDuration="1m18.596274898s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:52.573989864 +0000 UTC m=+104.289043252" watchObservedRunningTime="2025-10-06 08:42:52.596274898 +0000 UTC m=+104.311328286"
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.642900 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.643544 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.143507223 +0000 UTC m=+104.858560601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.743608 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.743947 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.243932203 +0000 UTC m=+104.958985581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.844943 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.845348 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.34533134 +0000 UTC m=+105.060384718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.946474 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.946627 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.446606094 +0000 UTC m=+105.161659492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:52 crc kubenswrapper[4610]: I1006 08:42:52.946787 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:52 crc kubenswrapper[4610]: E1006 08:42:52.947125 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.447115749 +0000 UTC m=+105.162169137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.051014 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.051204 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.551175672 +0000 UTC m=+105.266229070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.051325 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.051410 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.052382 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.552366076 +0000 UTC m=+105.267419464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.058367 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a62060d4-5efa-4c4f-851d-8738476f690e-metrics-certs\") pod \"network-metrics-daemon-46wzl\" (UID: \"a62060d4-5efa-4c4f-851d-8738476f690e\") " pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.152743 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.153011 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.652985451 +0000 UTC m=+105.368038839 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.153372 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.153664 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.65365186 +0000 UTC m=+105.368705238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.254798 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.255409 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.755395187 +0000 UTC m=+105.470448585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.294616 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-46wzl"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.323845 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" event={"ID":"5b4cf8c6-5eff-4d33-9e83-8643e75c0b7c","Type":"ContainerStarted","Data":"ee50f12fbde02b65b3a0b1c3662a66c9f4f5118db2cfecf3a6328caeb562491f"}
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.350250 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" event={"ID":"70fadbce-fb8b-4f7a-bbff-3036aba2a51e","Type":"ContainerStarted","Data":"56d10de486be0561f3b3e43bd752f645ab5c20a98188b16f7b5b7e75e16b3463"}
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.353874 4610 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m587k container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.353950 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" podUID="fb2950a4-c31b-47fd-bc69-84015e5e58c5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.355848 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.356729 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.856717701 +0000 UTC m=+105.571771079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.356916 4610 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2ktvk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.357076 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.376809 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7bcxq" podStartSLOduration=79.376796083 podStartE2EDuration="1m19.376796083s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.375392313 +0000 UTC m=+105.090445721" watchObservedRunningTime="2025-10-06 08:42:53.376796083 +0000 UTC m=+105.091849471"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.412956 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" podStartSLOduration=79.412936592 podStartE2EDuration="1m19.412936592s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.410418991 +0000 UTC m=+105.125472389" watchObservedRunningTime="2025-10-06 08:42:53.412936592 +0000 UTC m=+105.127990000"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.456446 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.456775 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6926w" podStartSLOduration=79.45676053 podStartE2EDuration="1m19.45676053s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.449110332 +0000 UTC m=+105.164163750" watchObservedRunningTime="2025-10-06 08:42:53.45676053 +0000 UTC m=+105.171813918"
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.458224 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:53.958199521 +0000 UTC m=+105.673253079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.480348 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" podStartSLOduration=79.480330671 podStartE2EDuration="1m19.480330671s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.479857788 +0000 UTC m=+105.194911196" watchObservedRunningTime="2025-10-06 08:42:53.480330671 +0000 UTC m=+105.195384079"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.489329 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-99z72"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.539640 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-75clg" podStartSLOduration=79.53962207 podStartE2EDuration="1m19.53962207s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.516386718 +0000 UTC m=+105.231440126" watchObservedRunningTime="2025-10-06 08:42:53.53962207 +0000 UTC m=+105.254675448"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.559620 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb"
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.560184 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.060167325 +0000 UTC m=+105.775220713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.567278 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.567347 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.582736 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qrfc4" podStartSLOduration=79.582716187 podStartE2EDuration="1m19.582716187s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.580864294 +0000 UTC m=+105.295917692" watchObservedRunningTime="2025-10-06 08:42:53.582716187 +0000 UTC m=+105.297769585"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.609355 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n88h5" podStartSLOduration=79.609333135 podStartE2EDuration="1m19.609333135s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.608878032 +0000 UTC m=+105.323931430" watchObservedRunningTime="2025-10-06 08:42:53.609333135 +0000 UTC m=+105.324386533"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.631073 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hljsk" podStartSLOduration=79.631055973 podStartE2EDuration="1m19.631055973s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.628531081 +0000 UTC m=+105.343584489" watchObservedRunningTime="2025-10-06 08:42:53.631055973 +0000 UTC m=+105.346109361"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.656564 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" podStartSLOduration=79.656540949 podStartE2EDuration="1m19.656540949s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.65375702 +0000 UTC m=+105.368810428" watchObservedRunningTime="2025-10-06 08:42:53.656540949 +0000 UTC m=+105.371594347"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.661431 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.661970 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.161951053 +0000 UTC m=+105.877004451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.681795 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-99z72" podStartSLOduration=79.681778498 podStartE2EDuration="1m19.681778498s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.680308586 +0000 UTC m=+105.395361994" watchObservedRunningTime="2025-10-06 08:42:53.681778498 +0000 UTC m=+105.396831876"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.687868 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-46wzl"]
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.751870 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9g4kq" podStartSLOduration=79.751835113 podStartE2EDuration="1m19.751835113s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.700234873 +0000 UTC m=+105.415288261" watchObservedRunningTime="2025-10-06 08:42:53.751835113 +0000 UTC m=+105.466888501"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.753858 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k6pjc" podStartSLOduration=79.75385155 podStartE2EDuration="1m19.75385155s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.751548774 +0000 UTC m=+105.466602162" watchObservedRunningTime="2025-10-06 08:42:53.75385155 +0000 UTC m=+105.468904938"
Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.764161 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") "
pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.764871 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.264858623 +0000 UTC m=+105.979912011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.784638 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mm5ft" podStartSLOduration=79.784622816 podStartE2EDuration="1m19.784622816s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:53.783393011 +0000 UTC m=+105.498446409" watchObservedRunningTime="2025-10-06 08:42:53.784622816 +0000 UTC m=+105.499676204" Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.865850 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.865983 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.365954892 +0000 UTC m=+106.081008280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.866248 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.866667 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.366651882 +0000 UTC m=+106.081705470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:53 crc kubenswrapper[4610]: I1006 08:42:53.966956 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:53 crc kubenswrapper[4610]: E1006 08:42:53.967287 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.467273197 +0000 UTC m=+106.182326585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.068523 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.068881 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.56886646 +0000 UTC m=+106.283919848 (durationBeforeRetry 500ms). 
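Every failure re-queues the operation with durationBeforeRetry=500ms, and the "No retries permitted until … m=+…" deadlines then advance in small steps as the volume reconciler sweeps the desired/actual state again. A sketch to make that cadence visible, assuming the journal excerpt above was saved to a file (the kubelet.log path is hypothetical): it extracts the monotonic m=+ offset from each deadline and prints the gap between consecutive retries.

```python
#!/usr/bin/env python3
"""Print the spacing between volume-retry deadlines in a kubelet log."""
import re

# Matches the deadline in "No retries permitted until ... m=+105.673253079".
DEADLINE = re.compile(r"No retries permitted until .*? m=\+(?P<m>[0-9.]+)")

with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical path
    offsets = [float(m["m"]) for m in DEADLINE.finditer(fh.read())]

for prev, cur in zip(offsets, offsets[1:]):
    print(f"m=+{cur:12.6f}  (+{cur - prev:.3f}s since previous deadline)")
```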
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.170089 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.170265 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.670240827 +0000 UTC m=+106.385294215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.170577 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.170918 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.670908726 +0000 UTC m=+106.385962114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.272272 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.272512 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.772482958 +0000 UTC m=+106.487536346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.273017 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.273459 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.773442546 +0000 UTC m=+106.488495934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.357581 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" event={"ID":"8ee7285a-6ddc-4b48-a89b-a6692ba95ce6","Type":"ContainerStarted","Data":"7e7ee1612ba3836dbaf9737e6f1b482a0f50d232283a256676b62036e322346b"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.360005 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fjb8x" event={"ID":"e4afea40-045a-4d72-92a9-f82cb6fe9cf3","Type":"ContainerStarted","Data":"7121698c6ab04d67f1b6149dce124ae72ea0da7e6fac08ca5d1db2ce77630997"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.361594 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" event={"ID":"9e5708d8-3116-4947-a74f-9551fbfdb501","Type":"ContainerStarted","Data":"0d2f0eb201c34502675eab424e233b012bca6938e3eedda66ffc085c24c81965"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.374527 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.374688 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.874666878 +0000 UTC m=+106.589720266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.374850 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.375184 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.875173292 +0000 UTC m=+106.590226670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.377220 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" event={"ID":"137638ac-cb39-4dc4-b21a-93f89b9297b6","Type":"ContainerStarted","Data":"5b3df8ad9fd43100ec8d550f43d60e030828e95c5387a50d99d7985b6ace0079"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.378636 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-46wzl" event={"ID":"a62060d4-5efa-4c4f-851d-8738476f690e","Type":"ContainerStarted","Data":"f5365e964bb5bbeaf2246bce093528e0d7f8b39816d1246e38eff42031228206"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.380952 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" event={"ID":"b549d43d-f011-4c3b-9fd6-b3af936f56ed","Type":"ContainerStarted","Data":"afeffd4846962127aa3e792bfac2f02aeefbbee5c1934c177c170d8bc6aba21b"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.382473 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" event={"ID":"5a502918-8f04-4208-b05c-d1d3fe1f7110","Type":"ContainerStarted","Data":"83d3c2c4a90f3198cc342260650e1d040fbbe60ea1eae7c4bc7c1a3811cd2b07"} Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.383247 4610 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2ktvk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.383292 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.414107 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5xmkk" podStartSLOduration=80.41407599 podStartE2EDuration="1m20.41407599s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:54.413515544 +0000 UTC m=+106.128568942" watchObservedRunningTime="2025-10-06 08:42:54.41407599 +0000 UTC m=+106.129129378" Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.414488 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f7qtd" podStartSLOduration=80.414482842 podStartE2EDuration="1m20.414482842s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:54.395415969 +0000 UTC m=+106.110469357" watchObservedRunningTime="2025-10-06 08:42:54.414482842 +0000 UTC m=+106.129536230" Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.475427 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.475694 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.975650854 +0000 UTC m=+106.690704242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.475805 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.476339 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:54.976311443 +0000 UTC m=+106.691364951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.490681 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.490756 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.576990 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.577206 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.077137414 +0000 UTC m=+106.792190802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.577323 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.577778 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.077761931 +0000 UTC m=+106.792815329 (durationBeforeRetry 500ms). 
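The router's startup probe keeps failing with "connection refused" on [::1]:1936, i.e. nothing is listening on the health port yet; the pod is started (its podStartSLOduration was logged above) but the router process has not bound the port. The probe can be reproduced by hand from the node; a sketch of the same GET the prober performs (it will print the refusal until the router opens the port):

```python
#!/usr/bin/env python3
"""Re-run the router startup probe by hand, from the node itself."""
import urllib.error
import urllib.request

URL = "http://localhost:1936/healthz/ready"  # endpoint named in the probe output

try:
    with urllib.request.urlopen(URL, timeout=2) as resp:
        print(f"status={resp.status}")        # any response means the port is open
except (urllib.error.URLError, OSError) as exc:
    print(f"probe would fail: {exc}")         # e.g. connection refused, as logged
```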
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.678780 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.678983 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.178956443 +0000 UTC m=+106.894009831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.679162 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.679506 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.179490128 +0000 UTC m=+106.894543696 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.780692 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.780873 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.280846844 +0000 UTC m=+106.995900232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.781075 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.781612 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.281597606 +0000 UTC m=+106.996651004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.882264 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.882575 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.3825439 +0000 UTC m=+107.097597288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:54 crc kubenswrapper[4610]: I1006 08:42:54.983695 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:54 crc kubenswrapper[4610]: E1006 08:42:54.984102 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.484081602 +0000 UTC m=+107.199135160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.084779 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.084960 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.584928543 +0000 UTC m=+107.299981931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.085075 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.085473 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.585458048 +0000 UTC m=+107.300511616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.186725 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.186904 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.686878456 +0000 UTC m=+107.401931844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.187075 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.187404 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.687395871 +0000 UTC m=+107.402449259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.288676 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.289351 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.789335994 +0000 UTC m=+107.504389382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.387457 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-46wzl" event={"ID":"a62060d4-5efa-4c4f-851d-8738476f690e","Type":"ContainerStarted","Data":"4d20855b9a2d7480fce48131baf76ee63c66a25490b43f312006893b3e0b9319"} Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.390772 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.391157 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.891144213 +0000 UTC m=+107.606197601 (durationBeforeRetry 500ms). 
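Amid the volume retries, the "SyncLoop (PLEG)" events record containers actually coming up; each event's Data field is the ID of the container (or sandbox) that started. A sketch to map pods to the IDs they started from a saved excerpt (kubelet.log is again a hypothetical path), relying on the event payload being the JSON object shown in these lines:

```python
import json
import re
from collections import defaultdict

# Matches: "SyncLoop (PLEG): event for pod" pod="ns/name" event={...}
EVENT = re.compile(
    r'"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=(\{.*?\})'
)

started = defaultdict(list)
with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical path
    for line in fh:
        for pod, payload in EVENT.findall(line):
            event = json.loads(payload)
            if event.get("Type") == "ContainerStarted":
                started[pod].append(event["Data"])

for pod, ids in sorted(started.items()):
    print(pod, *ids, sep="\n  ")
```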
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.489745 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.489812 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.492137 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.497515 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:55.997497162 +0000 UTC m=+107.712550550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.595793 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.596666 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.096653165 +0000 UTC m=+107.811706543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.698596 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.699174 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.199146724 +0000 UTC m=+107.914200112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.773695 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lzjcc" Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.792416 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x8h4d" podStartSLOduration=81.792394459 podStartE2EDuration="1m21.792394459s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:55.607923806 +0000 UTC m=+107.322977184" watchObservedRunningTime="2025-10-06 08:42:55.792394459 +0000 UTC m=+107.507447847" Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.799982 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.800306 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.300293804 +0000 UTC m=+108.015347192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.900893 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.901165 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.401131475 +0000 UTC m=+108.116184893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:55 crc kubenswrapper[4610]: I1006 08:42:55.901460 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:55 crc kubenswrapper[4610]: E1006 08:42:55.901858 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.401838586 +0000 UTC m=+108.116891984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.002655 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.002836 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.502809971 +0000 UTC m=+108.217863359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.002983 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.003338 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.503329975 +0000 UTC m=+108.218383363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.104392 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.104765 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.604731453 +0000 UTC m=+108.319784841 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.104980 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.105321 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.60531266 +0000 UTC m=+108.320366038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.154279 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.154864 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.157888 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.157913 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.175829 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.206309 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.206699 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.706667066 +0000 UTC m=+108.421720454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.206748 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565a815b-7507-4313-ad53-9007c4e307db-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.206953 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565a815b-7507-4313-ad53-9007c4e307db-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.207118 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.207683 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.707673275 +0000 UTC m=+108.422726663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.287500 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xj96b" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.294257 4610 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-l4qt7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]log ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]etcd ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 08:42:56 crc kubenswrapper[4610]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld Oct 06 08:42:56 crc kubenswrapper[4610]: [+]poststarthook/max-in-flight-filter ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 06 08:42:56 crc kubenswrapper[4610]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 06 08:42:56 crc kubenswrapper[4610]: livez check failed Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.294781 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" podUID="3f6c789d-43eb-48d0-aad4-cd6eef7bc706" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.307850 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.308013 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.807992401 +0000 UTC m=+108.523045789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.308551 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565a815b-7507-4313-ad53-9007c4e307db-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.308707 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.308843 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565a815b-7507-4313-ad53-9007c4e307db-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.309142 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.809120843 +0000 UTC m=+108.524174231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.308704 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565a815b-7507-4313-ad53-9007c4e307db-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.342750 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565a815b-7507-4313-ad53-9007c4e307db-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.391211 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fjb8x" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.409912 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.411185 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:56.911166919 +0000 UTC m=+108.626220307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.425736 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" podStartSLOduration=82.425711013 podStartE2EDuration="1m22.425711013s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:56.408545744 +0000 UTC m=+108.123599132" watchObservedRunningTime="2025-10-06 08:42:56.425711013 +0000 UTC m=+108.140764411" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.448134 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fjb8x" podStartSLOduration=11.448115381 podStartE2EDuration="11.448115381s" podCreationTimestamp="2025-10-06 08:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:56.446819464 +0000 UTC m=+108.161872872" watchObservedRunningTime="2025-10-06 08:42:56.448115381 +0000 UTC m=+108.163168769" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.449131 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rjrwh" podStartSLOduration=82.44912453 podStartE2EDuration="1m22.44912453s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:56.42734256 +0000 UTC m=+108.142395958" watchObservedRunningTime="2025-10-06 08:42:56.44912453 +0000 UTC m=+108.164177918" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.479951 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.489993 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.490105 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.515581 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.517095 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.017072875 +0000 UTC m=+108.732126473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.617698 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.618196 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.118172504 +0000 UTC m=+108.833225892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.711943 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.719836 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.720305 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.220281441 +0000 UTC m=+108.935334839 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.820867 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.821131 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.321088282 +0000 UTC m=+109.036141670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.821207 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.824014 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.323974934 +0000 UTC m=+109.039028322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.922613 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.922816 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.422784837 +0000 UTC m=+109.137838225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.923157 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:56 crc kubenswrapper[4610]: E1006 08:42:56.923447 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.423440135 +0000 UTC m=+109.138493523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:56 crc kubenswrapper[4610]: I1006 08:42:56.944741 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.024035 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.024606 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.524591626 +0000 UTC m=+109.239645004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.126165 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.126530 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.626519228 +0000 UTC m=+109.341572616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.226827 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.226931 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.726914517 +0000 UTC m=+109.441967905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.227106 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.227384 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.72737622 +0000 UTC m=+109.442429608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.329988 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.330258 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.830237589 +0000 UTC m=+109.545290997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.330446 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.330772 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.830762164 +0000 UTC m=+109.545815562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.397449 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" event={"ID":"b549d43d-f011-4c3b-9fd6-b3af936f56ed","Type":"ContainerStarted","Data":"cfcfb3b3577987abfd970250dd1480ecd0035baed8cab123b0854ae3db81ba1b"} Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.400063 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"565a815b-7507-4313-ad53-9007c4e307db","Type":"ContainerStarted","Data":"e25872212ed81bd6fe014c2c00e3a71eea8eff68d745cf3004a05e5eac31368e"} Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.431715 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.432273 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:57.932258194 +0000 UTC m=+109.647311582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.524067 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.524124 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.533670 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.534032 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.034019242 +0000 UTC m=+109.749072630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.635017 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.635277 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.135262475 +0000 UTC m=+109.850315863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.707394 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.707441 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.707760 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.707780 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.736842 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.737475 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.237463335 +0000 UTC m=+109.952516723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.839320 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.839463 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.339427359 +0000 UTC m=+110.054480747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.839822 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.840234 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.340222991 +0000 UTC m=+110.055276379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.876580 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.876615 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.878301 4610 patch_prober.go:28] interesting pod/console-f9d7485db-8p28v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.878354 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8p28v" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.941339 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.941512 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.441475575 +0000 UTC m=+110.156528963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.942856 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:57 crc kubenswrapper[4610]: E1006 08:42:57.943442 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:42:58.44343205 +0000 UTC m=+110.158485438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:57 crc kubenswrapper[4610]: I1006 08:42:57.959118 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.022375 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.023829 4610 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7qsjg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.023893 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.024537 4610 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7qsjg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.024567 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.043900 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.045971 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.5459526 +0000 UTC m=+110.261005998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.105370 4610 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2ktvk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.105431 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.149065 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.149428 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.649415926 +0000 UTC m=+110.364469304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.160173 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.250677 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.250871 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.750848384 +0000 UTC m=+110.465901772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.251431 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.252721 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.752712507 +0000 UTC m=+110.467765895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.353072 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.353335 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.853282061 +0000 UTC m=+110.568335459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.353622 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.353946 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.85392974 +0000 UTC m=+110.568983128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.429415 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" podStartSLOduration=84.429396629 podStartE2EDuration="1m24.429396629s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:58.426197458 +0000 UTC m=+110.141250846" watchObservedRunningTime="2025-10-06 08:42:58.429396629 +0000 UTC m=+110.144450017" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.443112 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.455463 4610 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t5gpw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.455709 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.456259 4610 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t5gpw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 
10.217.0.35:8080: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.456300 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.456328 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.456476 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.956456549 +0000 UTC m=+110.671509937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.456861 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.457896 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:58.95788002 +0000 UTC m=+110.672933408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.458107 4610 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t5gpw container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.458162 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.479339 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.480613 4610 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-27r27 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.480663 4610 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-27r27 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.480674 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" podUID="11417b4a-6ad4-44ea-9ba1-45a98d2fb619" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.480722 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" podUID="11417b4a-6ad4-44ea-9ba1-45a98d2fb619" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.481305 4610 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-27r27 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.481335 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" 
podUID="11417b4a-6ad4-44ea-9ba1-45a98d2fb619" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.488749 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.490623 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.490704 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.497958 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.505902 4610 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m587k container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.505997 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" podUID="fb2950a4-c31b-47fd-bc69-84015e5e58c5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.506241 4610 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m587k container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.506367 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" podUID="fb2950a4-c31b-47fd-bc69-84015e5e58c5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.557897 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.558175 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:42:59.058134725 +0000 UTC m=+110.773188113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.558819 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.560157 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.060133001 +0000 UTC m=+110.775186610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.660466 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.660785 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.160736696 +0000 UTC m=+110.875790094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.660912 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.661245 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.1612302 +0000 UTC m=+110.876283588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.761963 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.762082 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.262065952 +0000 UTC m=+110.977119340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.762380 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.762790 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.262764712 +0000 UTC m=+110.977818100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.863894 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.864062 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.364021975 +0000 UTC m=+111.079075353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.864505 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.865235 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.365205469 +0000 UTC m=+111.080258867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.918915 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.965714 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.966637 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.465943587 +0000 UTC m=+111.180996985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:58 crc kubenswrapper[4610]: I1006 08:42:58.968022 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:58 crc kubenswrapper[4610]: E1006 08:42:58.968680 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.468652074 +0000 UTC m=+111.183705462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.050632 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.051299 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.054951 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.055105 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.069821 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.073616 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.074211 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.57419206 +0000 UTC m=+111.289245448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.162858 4610 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7z998 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.162913 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" podUID="c7b26f53-77fe-4ee8-a966-f95ad3dcaae1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.163289 4610 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7z998 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.163388 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" podUID="c7b26f53-77fe-4ee8-a966-f95ad3dcaae1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.174981 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bf1e070-ebdd-4793-b691-abb48e4e426b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.175107 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bf1e070-ebdd-4793-b691-abb48e4e426b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.175185 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.175926 4610 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.675912376 +0000 UTC m=+111.390965764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.276434 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.276594 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bf1e070-ebdd-4793-b691-abb48e4e426b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.276668 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.776630654 +0000 UTC m=+111.491684112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.276863 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bf1e070-ebdd-4793-b691-abb48e4e426b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.276901 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bf1e070-ebdd-4793-b691-abb48e4e426b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.276956 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.277243 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.777232351 +0000 UTC m=+111.492285739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.293781 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bf1e070-ebdd-4793-b691-abb48e4e426b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.365678 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.377589 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.377978 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.87796267 +0000 UTC m=+111.593016058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.415462 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" event={"ID":"f8b408c0-509e-4ff0-9688-7d142ec0a14e","Type":"ContainerStarted","Data":"534cb55294826f025f64c3e16274e08aff4c347fa83a431c8a45a48bfe491515"} Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.418981 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-46wzl" event={"ID":"a62060d4-5efa-4c4f-851d-8738476f690e","Type":"ContainerStarted","Data":"394ee04bf724605b1be228a8bb9e8ec2da5fe5f9741a061bd7f9e312f20bdd4f"} Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.420935 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"565a815b-7507-4313-ad53-9007c4e307db","Type":"ContainerStarted","Data":"558fb18c39963c6590fe708a39ea15e323c2b1d325a77f2cbabed8adfec3a359"} Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.442839 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-46wzl" podStartSLOduration=85.442821597 podStartE2EDuration="1m25.442821597s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:59.440610354 +0000 UTC m=+111.155663732" watchObservedRunningTime="2025-10-06 08:42:59.442821597 +0000 UTC m=+111.157874985" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.458310 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.458280627 podStartE2EDuration="3.458280627s" podCreationTimestamp="2025-10-06 08:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:59.457539806 +0000 UTC m=+111.172593194" watchObservedRunningTime="2025-10-06 08:42:59.458280627 +0000 UTC m=+111.173334015" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.480228 4610 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.481230 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:42:59.98120559 +0000 UTC m=+111.696258978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.529334 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:42:59 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:42:59 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:42:59 crc kubenswrapper[4610]: healthz check failed Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.529414 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.581228 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.581373 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.081346181 +0000 UTC m=+111.796399569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.581669 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.581941 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.081932058 +0000 UTC m=+111.796985446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.682376 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.682505 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.182471741 +0000 UTC m=+111.897525129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.682632 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.682940 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.182932174 +0000 UTC m=+111.897985752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.783216 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.783562 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.283546789 +0000 UTC m=+111.998600177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.878507 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.884841 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.885345 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.385331987 +0000 UTC m=+112.100385375 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:42:59 crc kubenswrapper[4610]: W1006 08:42:59.888570 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6bf1e070_ebdd_4793_b691_abb48e4e426b.slice/crio-0af8857b6ff58e52affe91a8db9cf9b2aa808188bc33f668427dd5823e9cb9ab WatchSource:0}: Error finding container 0af8857b6ff58e52affe91a8db9cf9b2aa808188bc33f668427dd5823e9cb9ab: Status 404 returned error can't find the container with id 0af8857b6ff58e52affe91a8db9cf9b2aa808188bc33f668427dd5823e9cb9ab Oct 06 08:42:59 crc kubenswrapper[4610]: I1006 08:42:59.985799 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:42:59 crc kubenswrapper[4610]: E1006 08:42:59.986246 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.486230091 +0000 UTC m=+112.201283479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.088068 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.088380 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.588366279 +0000 UTC m=+112.303419667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.163533 4610 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7z998 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": context deadline exceeded" start-of-body= Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.163586 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" podUID="c7b26f53-77fe-4ee8-a966-f95ad3dcaae1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": context deadline exceeded" Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.189489 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.189614 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.689595462 +0000 UTC m=+112.404648860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.190329 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.690322052 +0000 UTC m=+112.405375440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.190038 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.291732 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.291945 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.791911035 +0000 UTC m=+112.506964423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.292389 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.292696 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.792683637 +0000 UTC m=+112.507737025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.393892 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.394085 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.894063094 +0000 UTC m=+112.609116482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.394300 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.394574 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.894561498 +0000 UTC m=+112.609614886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.426222 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6bf1e070-ebdd-4793-b691-abb48e4e426b","Type":"ContainerStarted","Data":"0af8857b6ff58e52affe91a8db9cf9b2aa808188bc33f668427dd5823e9cb9ab"} Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.427861 4610 generic.go:334] "Generic (PLEG): container finished" podID="565a815b-7507-4313-ad53-9007c4e307db" containerID="558fb18c39963c6590fe708a39ea15e323c2b1d325a77f2cbabed8adfec3a359" exitCode=0 Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.427949 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"565a815b-7507-4313-ad53-9007c4e307db","Type":"ContainerDied","Data":"558fb18c39963c6590fe708a39ea15e323c2b1d325a77f2cbabed8adfec3a359"} Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.492630 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:00 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:00 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:00 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.492722 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.495284 4610 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.495429 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.995407479 +0000 UTC m=+112.710460877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.495543 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.495819 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:00.99581014 +0000 UTC m=+112.710863528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.596549 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.596769 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.096736184 +0000 UTC m=+112.811789572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.596896 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.597267 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.097259159 +0000 UTC m=+112.812312547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.697894 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.697983 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.197964997 +0000 UTC m=+112.913018385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.698264 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.698503 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.198495172 +0000 UTC m=+112.913548560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.799861 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.800262 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.300245039 +0000 UTC m=+113.015298427 (durationBeforeRetry 500ms). 
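Each failure is immediately fenced off with "No retries permitted until <deadline> (durationBeforeRetry 500ms)": nestedpendingoperations.go records a per-operation not-before deadline, and subsequent attempts are refused until it passes. A rough sketch of that gating, assuming a flat 500ms delay as shown in these lines (the real code also tracks an error count per operation; opBackoff, tryRun, and errRetryTooSoon are invented names):

package main

import (
	"errors"
	"fmt"
	"time"
)

// opBackoff gates retries of one named operation (e.g. one volume's
// MountDevice) behind a not-before deadline.
type opBackoff struct {
	delay     time.Duration
	notBefore time.Time
}

var errRetryTooSoon = errors.New("no retries permitted yet")

// tryRun executes op only if the backoff window has elapsed; on failure it
// re-arms the window and reports when the next attempt is allowed.
func (b *opBackoff) tryRun(now time.Time, op func() error) error {
	if now.Before(b.notBefore) {
		return fmt.Errorf("%w: until %s", errRetryTooSoon, b.notBefore.Format(time.RFC3339Nano))
	}
	if err := op(); err != nil {
		b.notBefore = now.Add(b.delay)
		return fmt.Errorf("failed, no retries permitted until %s (durationBeforeRetry %s): %v",
			b.notBefore.Format(time.RFC3339Nano), b.delay, err)
	}
	return nil
}

func main() {
	b := &opBackoff{delay: 500 * time.Millisecond}
	mount := func() error { return errors.New("driver not registered") }

	fmt.Println(b.tryRun(time.Now(), mount)) // fails, arms a 500ms window
	fmt.Println(b.tryRun(time.Now(), mount)) // refused: still inside the window
	time.Sleep(600 * time.Millisecond)
	fmt.Println(b.tryRun(time.Now(), mount)) // retried once the window elapses
}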
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:00 crc kubenswrapper[4610]: I1006 08:43:00.901806 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:00 crc kubenswrapper[4610]: E1006 08:43:00.902168 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.402153381 +0000 UTC m=+113.117206769 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.002726 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.002925 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.50289874 +0000 UTC m=+113.217952128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.003163 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.003478 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.503463816 +0000 UTC m=+113.218517204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.104713 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.104890 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.604860193 +0000 UTC m=+113.319913581 (durationBeforeRetry 500ms). 
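The same Unmount/Mount pair recurs roughly every 100ms because the volume manager's reconciler re-diffs desired state (old pod 8f668bae-… should release the PVC, new pod be459a27-… should mount it) against actual state on every sync tick and re-issues whatever is still pending; the backoff above merely spaces out the real attempts. A schematic of that loop, collapsed to one flag per volume rather than kubelet's per-pod bookkeeping — all type and field names here are invented:

package main

import (
	"fmt"
	"time"
)

// state is a toy desired-vs-actual pair: wanted is what pods ask for,
// mounted is what the node currently has.
type state struct {
	mounted map[string]bool
	wanted  map[string]bool
}

// reconcile issues whichever operation closes the gap; failed operations
// leave the gap open, so the next tick re-issues them.
func (s *state) reconcile(attempt func(verb, vol string) error) {
	for vol, want := range s.wanted {
		got := s.mounted[vol]
		switch {
		case want && !got:
			if err := attempt("MountVolume", vol); err == nil {
				s.mounted[vol] = true
			}
		case !want && got:
			if err := attempt("UnmountVolume", vol); err == nil {
				delete(s.mounted, vol)
			}
		}
	}
}

func main() {
	s := &state{
		mounted: map[string]bool{"pvc-657094db": true}, // the old pod still holds it
		wanted:  map[string]bool{"pvc-657094db": false},
	}
	fail := func(verb, vol string) error {
		err := fmt.Errorf("%s %s: driver not registered", verb, vol)
		fmt.Println(err)
		return err
	}
	// Two ticks ~100ms apart re-issue the identical pending operation,
	// which is exactly the repetition visible in the surrounding log.
	for i := 0; i < 2; i++ {
		s.reconcile(fail)
		time.Sleep(100 * time.Millisecond)
	}
}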
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.105034 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.105349 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.605337837 +0000 UTC m=+113.320391225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.206392 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.206715 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.706700813 +0000 UTC m=+113.421754201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.254723 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.305542 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l4qt7" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.307724 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.308612 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.808595085 +0000 UTC m=+113.523648653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.408710 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.409127 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:01.909112397 +0000 UTC m=+113.624165785 (durationBeforeRetry 500ms). 
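The probe machinery shows both outcomes around here: the oauth-apiserver startup and readiness probes just flipped to started/ready, while the router's startup probe keeps returning the healthz-style body quoted above ("[-]backend-http failed: reason withheld … [+]process-running ok … healthz check failed"). That format is one line per named check, [+] for passing and [-] for failing, with HTTP 500 overall while anything fails — which is what prober.go then records as "statuscode: 500". A self-contained handler in that style; the check names are taken from the log, but the failure reasons and port are placeholders:

package main

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	fn   func() error
}

// healthzHandler writes one line per check in the [+]/[-] style of the
// router's health endpoint and returns 500 while any check fails.
func healthzHandler(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for _, c := range checks {
			if err := c.fn(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // the probe reports "statuscode: 500"
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	checks := []check{
		{"backend-http", func() error { return fmt.Errorf("no ready backends") }},
		{"has-synced", func() error { return fmt.Errorf("initial sync pending") }},
		{"process-running", func() error { return nil }},
	}
	http.HandleFunc("/healthz", healthzHandler(checks))
	http.ListenAndServe(":1936", nil) // e.g. curl -i localhost:1936/healthz
}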
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.433587 4610 generic.go:334] "Generic (PLEG): container finished" podID="6f2291f3-fb1c-4d23-9f78-59ef302b5c02" containerID="dda44c10233a7de7bbe4e18440e175fafc2a7afda4e162068e939d1e77031fa0" exitCode=0 Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.433638 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" event={"ID":"6f2291f3-fb1c-4d23-9f78-59ef302b5c02","Type":"ContainerDied","Data":"dda44c10233a7de7bbe4e18440e175fafc2a7afda4e162068e939d1e77031fa0"} Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.436540 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6bf1e070-ebdd-4793-b691-abb48e4e426b","Type":"ContainerStarted","Data":"f631701237f6d09a9017c2baa307a432454ce1d57819f570c5d6317ced483f83"} Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.499106 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:01 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:01 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:01 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.500624 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.503247 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.503228927 podStartE2EDuration="2.503228927s" podCreationTimestamp="2025-10-06 08:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:43:01.499853841 +0000 UTC m=+113.214907229" watchObservedRunningTime="2025-10-06 08:43:01.503228927 +0000 UTC m=+113.218282315" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.510491 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.511235 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:43:02.011220565 +0000 UTC m=+113.726273953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.612168 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.612716 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.112682174 +0000 UTC m=+113.827735552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.714427 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.714767 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.214756241 +0000 UTC m=+113.929809629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.773230 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.815794 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565a815b-7507-4313-ad53-9007c4e307db-kubelet-dir\") pod \"565a815b-7507-4313-ad53-9007c4e307db\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.815859 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565a815b-7507-4313-ad53-9007c4e307db-kube-api-access\") pod \"565a815b-7507-4313-ad53-9007c4e307db\" (UID: \"565a815b-7507-4313-ad53-9007c4e307db\") " Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.816092 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/565a815b-7507-4313-ad53-9007c4e307db-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "565a815b-7507-4313-ad53-9007c4e307db" (UID: "565a815b-7507-4313-ad53-9007c4e307db"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.816148 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.816263 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.316247141 +0000 UTC m=+114.031300529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.816437 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.816734 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.316724944 +0000 UTC m=+114.031778332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.816820 4610 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565a815b-7507-4313-ad53-9007c4e307db-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.822137 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565a815b-7507-4313-ad53-9007c4e307db-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "565a815b-7507-4313-ad53-9007c4e307db" (UID: "565a815b-7507-4313-ad53-9007c4e307db"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.918109 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.918236 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.418209814 +0000 UTC m=+114.133263202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.918455 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:01 crc kubenswrapper[4610]: I1006 08:43:01.918513 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565a815b-7507-4313-ad53-9007c4e307db-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:01 crc kubenswrapper[4610]: E1006 08:43:01.918777 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.41876466 +0000 UTC m=+114.133818048 (durationBeforeRetry 500ms). 
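Note that teardown itself is healthy for in-tree plugins: the finished pruner pod's host-path and projected volumes TearDown cleanly and are then logged by reconciler_common.go:293 as "Volume detached … DevicePath \"\"" when dropped from the kubelet's record of what is mounted on the node; only the CSI-backed PVC stays stuck, because its unmount cannot reach a driver at all. A toy version of that end-of-life bookkeeping — actualStateOfWorld and markVolumeDetached are invented names:

package main

import "fmt"

// actualStateOfWorld is a stand-in for the kubelet's record of which
// volumes are attached/mounted on the node.
type actualStateOfWorld struct {
	attached map[string]string // volume name -> device path ("" for plugins with no device)
}

// markVolumeDetached drops the volume from the record; the kubelet logs the
// equivalent "Volume detached for volume ... DevicePath \"\"" line.
func (asw *actualStateOfWorld) markVolumeDetached(vol string) {
	dev := asw.attached[vol]
	delete(asw.attached, vol)
	fmt.Printf("Volume detached for volume %q on node \"crc\" DevicePath %q\n", vol, dev)
}

func main() {
	asw := &actualStateOfWorld{attached: map[string]string{
		"kubelet-dir":     "", // host-path: no block device
		"kube-api-access": "", // projected: no block device
	}}
	asw.markVolumeDetached("kubelet-dir")
	asw.markVolumeDetached("kube-api-access")
}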
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.019658 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.019866 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.519838808 +0000 UTC m=+114.234892196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.020115 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.020424 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.520412714 +0000 UTC m=+114.235466102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.121631 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.121794 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.621767821 +0000 UTC m=+114.336821209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.121905 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.122282 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.622273425 +0000 UTC m=+114.337326813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.189120 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84vcq"] Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.189321 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565a815b-7507-4313-ad53-9007c4e307db" containerName="pruner" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.189333 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="565a815b-7507-4313-ad53-9007c4e307db" containerName="pruner" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.189429 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="565a815b-7507-4313-ad53-9007c4e307db" containerName="pruner" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.190241 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.196107 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.199401 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84vcq"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.223358 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.223561 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.723533848 +0000 UTC m=+114.438587236 (durationBeforeRetry 500ms). 
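When community-operators-84vcq is admitted, the resource managers first sweep state left behind by pods the kubelet no longer tracks, hence the "RemoveStaleState: removing container" / "Deleted CPUSet assignment" lines for the long-gone pruner container of pod 565a815b-…. A minimal sketch of such a sweep, assuming map-backed state along the lines of the cpu/memory manager checkpoint files — the types and the active-pod set are illustrative:

package main

import "fmt"

// assignments maps podUID -> container -> an opaque resource assignment
// (CPU set, NUMA affinity, ...).
type assignments map[string]map[string]string

// removeStaleState drops assignments for pods that are no longer active,
// mirroring what the cpu_manager and memory_manager do on pod admission.
func removeStaleState(st assignments, active map[string]bool) {
	for podUID, containers := range st {
		if active[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
		}
		delete(st, podUID)
	}
}

func main() {
	st := assignments{
		"565a815b-7507-4313-ad53-9007c4e307db": {"pruner": "cpuset 0-3"},
	}
	active := map[string]bool{"8c2dfe74-d5c9-4602-a697-cc40064871b9": true} // the newly admitted pod
	removeStaleState(st, active)
}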
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.223597 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-utilities\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.223639 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2jp\" (UniqueName: \"kubernetes.io/projected/8c2dfe74-d5c9-4602-a697-cc40064871b9-kube-api-access-jp2jp\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.223679 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.223736 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-catalog-content\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.224209 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.724195267 +0000 UTC m=+114.439248645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.324794 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.325006 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.824980627 +0000 UTC m=+114.540034015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.325117 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-catalog-content\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.325185 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-utilities\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.325211 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2jp\" (UniqueName: \"kubernetes.io/projected/8c2dfe74-d5c9-4602-a697-cc40064871b9-kube-api-access-jp2jp\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.325235 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.325492 4610 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.825481681 +0000 UTC m=+114.540535069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.325626 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-catalog-content\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.325749 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-utilities\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.346814 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2jp\" (UniqueName: \"kubernetes.io/projected/8c2dfe74-d5c9-4602-a697-cc40064871b9-kube-api-access-jp2jp\") pod \"community-operators-84vcq\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.396945 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h8dz4"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.398076 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.401202 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.426788 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.427102 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:02.927085435 +0000 UTC m=+114.642138823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.441238 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"565a815b-7507-4313-ad53-9007c4e307db","Type":"ContainerDied","Data":"e25872212ed81bd6fe014c2c00e3a71eea8eff68d745cf3004a05e5eac31368e"} Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.441299 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e25872212ed81bd6fe014c2c00e3a71eea8eff68d745cf3004a05e5eac31368e" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.441381 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.465173 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8dz4"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.491713 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:02 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:02 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:02 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.491794 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.527803 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-catalog-content\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.527988 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.528115 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76tl\" (UniqueName: \"kubernetes.io/projected/b812be8d-d295-4878-9af8-c0387a655dbc-kube-api-access-b76tl\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 
08:43:02.528256 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-utilities\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.528623 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.028610036 +0000 UTC m=+114.743663424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.555501 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.595416 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4prcm"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.596374 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.632660 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.633092 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-utilities\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.633150 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-catalog-content\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.633192 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-utilities\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.633240 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-catalog-content\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.633269 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b76tl\" (UniqueName: \"kubernetes.io/projected/b812be8d-d295-4878-9af8-c0387a655dbc-kube-api-access-b76tl\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.633296 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-922nv\" (UniqueName: \"kubernetes.io/projected/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-kube-api-access-922nv\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.633463 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.133445821 +0000 UTC m=+114.848499209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.641989 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-utilities\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.645381 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-catalog-content\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.646144 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4prcm"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.690095 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76tl\" (UniqueName: \"kubernetes.io/projected/b812be8d-d295-4878-9af8-c0387a655dbc-kube-api-access-b76tl\") pod \"certified-operators-h8dz4\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") " pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.727427 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.745501 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-utilities\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.745548 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.745580 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-catalog-content\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.745604 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-922nv\" (UniqueName: \"kubernetes.io/projected/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-kube-api-access-922nv\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.746590 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-utilities\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.746822 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.246811749 +0000 UTC m=+114.961865137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.747160 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-catalog-content\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.803851 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-922nv\" (UniqueName: \"kubernetes.io/projected/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-kube-api-access-922nv\") pod \"community-operators-4prcm\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.821664 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7gdb"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.823164 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.828202 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7gdb"] Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.847872 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.848365 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.348349451 +0000 UTC m=+115.063402839 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.922888 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.950825 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rhd\" (UniqueName: \"kubernetes.io/projected/a4bb15af-cc88-4019-a8b5-7e1670842bc3-kube-api-access-r9rhd\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.950887 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-catalog-content\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.950930 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:02 crc kubenswrapper[4610]: I1006 08:43:02.950953 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-utilities\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:02 crc kubenswrapper[4610]: E1006 08:43:02.951253 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.45124111 +0000 UTC m=+115.166294498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.051752 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.051902 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.551876346 +0000 UTC m=+115.266929734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.052393 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.052422 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-utilities\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.052525 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rhd\" (UniqueName: \"kubernetes.io/projected/a4bb15af-cc88-4019-a8b5-7e1670842bc3-kube-api-access-r9rhd\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.052575 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-catalog-content\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.052992 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.552966177 +0000 UTC m=+115.268019565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.068357 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.068436 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.069407 4610 patch_prober.go:28] interesting pod/apiserver-76f77b778f-kxqjk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.069464 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" podUID="b549d43d-f011-4c3b-9fd6-b3af936f56ed" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.127513 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84vcq"] Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.155948 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.156029 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.656014851 +0000 UTC m=+115.371068239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.157637 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.157954 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.657944866 +0000 UTC m=+115.372998254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.259292 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.259447 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.759426076 +0000 UTC m=+115.474479464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.259488 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.259794 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.759786536 +0000 UTC m=+115.474839924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.274804 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8dz4"] Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.360189 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.360363 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.86033752 +0000 UTC m=+115.575390908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.360464 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.360860 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.860848264 +0000 UTC m=+115.575901652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.440927 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-utilities\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.441172 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-catalog-content\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.441808 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rhd\" (UniqueName: \"kubernetes.io/projected/a4bb15af-cc88-4019-a8b5-7e1670842bc3-kube-api-access-r9rhd\") pod \"certified-operators-p7gdb\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.447262 4610 generic.go:334] "Generic (PLEG): container finished" podID="6bf1e070-ebdd-4793-b691-abb48e4e426b" containerID="f631701237f6d09a9017c2baa307a432454ce1d57819f570c5d6317ced483f83" exitCode=0 Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.447325 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6bf1e070-ebdd-4793-b691-abb48e4e426b","Type":"ContainerDied","Data":"f631701237f6d09a9017c2baa307a432454ce1d57819f570c5d6317ced483f83"} Oct 06 08:43:03 crc kubenswrapper[4610]: W1006 08:43:03.450166 4610 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2dfe74_d5c9_4602_a697_cc40064871b9.slice/crio-535f755bed49471791a7b9d9aeb5a07fe6c6dc8b4a7737ccf916d7b91cc604ec WatchSource:0}: Error finding container 535f755bed49471791a7b9d9aeb5a07fe6c6dc8b4a7737ccf916d7b91cc604ec: Status 404 returned error can't find the container with id 535f755bed49471791a7b9d9aeb5a07fe6c6dc8b4a7737ccf916d7b91cc604ec Oct 06 08:43:03 crc kubenswrapper[4610]: W1006 08:43:03.450813 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb812be8d_d295_4878_9af8_c0387a655dbc.slice/crio-668515753e2e10ec64751921c4078775696dbe146a8eb0c060c0d9ea71755de6 WatchSource:0}: Error finding container 668515753e2e10ec64751921c4078775696dbe146a8eb0c060c0d9ea71755de6: Status 404 returned error can't find the container with id 668515753e2e10ec64751921c4078775696dbe146a8eb0c060c0d9ea71755de6 Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.457838 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.460893 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.460996 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.960972805 +0000 UTC m=+115.676026193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.460896 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.461239 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.461550 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:03.961537061 +0000 UTC m=+115.676590449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.492108 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:03 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:03 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:03 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.492162 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.541109 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fjb8x" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.571719 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-config-volume\") pod \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.571805 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8fzl\" (UniqueName: \"kubernetes.io/projected/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-kube-api-access-b8fzl\") pod \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.571836 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-secret-volume\") pod \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\" (UID: \"6f2291f3-fb1c-4d23-9f78-59ef302b5c02\") " Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.571991 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.572503 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.072484331 +0000 UTC m=+115.787537719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.573771 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f2291f3-fb1c-4d23-9f78-59ef302b5c02" (UID: "6f2291f3-fb1c-4d23-9f78-59ef302b5c02"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.607074 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f2291f3-fb1c-4d23-9f78-59ef302b5c02" (UID: "6f2291f3-fb1c-4d23-9f78-59ef302b5c02"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.609099 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-kube-api-access-b8fzl" (OuterVolumeSpecName: "kube-api-access-b8fzl") pod "6f2291f3-fb1c-4d23-9f78-59ef302b5c02" (UID: "6f2291f3-fb1c-4d23-9f78-59ef302b5c02"). InnerVolumeSpecName "kube-api-access-b8fzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.673150 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.673211 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.673226 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8fzl\" (UniqueName: \"kubernetes.io/projected/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-kube-api-access-b8fzl\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.673240 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f2291f3-fb1c-4d23-9f78-59ef302b5c02-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.673476 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.173465446 +0000 UTC m=+115.888518834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.774135 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.774544 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.274526974 +0000 UTC m=+115.989580362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.842354 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4prcm"] Oct 06 08:43:03 crc kubenswrapper[4610]: W1006 08:43:03.853115 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60cbdd2_6ec0_45e3_bcd9_a6a100c75f30.slice/crio-6ab51236149efcc07f211b563925c6ec75a4a578b0dad44b1417636288934f6e WatchSource:0}: Error finding container 6ab51236149efcc07f211b563925c6ec75a4a578b0dad44b1417636288934f6e: Status 404 returned error can't find the container with id 6ab51236149efcc07f211b563925c6ec75a4a578b0dad44b1417636288934f6e Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.876304 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.876687 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.376670333 +0000 UTC m=+116.091723721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.978240 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.978497 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.478448851 +0000 UTC m=+116.193502239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:03 crc kubenswrapper[4610]: I1006 08:43:03.979005 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:03 crc kubenswrapper[4610]: E1006 08:43:03.979300 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.479288995 +0000 UTC m=+116.194342383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.079974 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.080145 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.580118615 +0000 UTC m=+116.295172003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.080639 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.080992 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.580982429 +0000 UTC m=+116.296036027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.129428 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7gdb"] Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.181394 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.181811 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.681759049 +0000 UTC m=+116.396812447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.182155 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.182539 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.682521031 +0000 UTC m=+116.397574419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.201122 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r5nfg"] Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.201534 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2291f3-fb1c-4d23-9f78-59ef302b5c02" containerName="collect-profiles" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.201547 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2291f3-fb1c-4d23-9f78-59ef302b5c02" containerName="collect-profiles" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.201751 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2291f3-fb1c-4d23-9f78-59ef302b5c02" containerName="collect-profiles" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.203026 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.210231 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.227348 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5nfg"] Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.283365 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.283714 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.783697122 +0000 UTC m=+116.498750510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.283798 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pq28\" (UniqueName: \"kubernetes.io/projected/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-kube-api-access-5pq28\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.283884 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-utilities\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.283909 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.283939 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-catalog-content\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.284246 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.784239027 +0000 UTC m=+116.499292415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.385123 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.385438 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-catalog-content\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.385471 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pq28\" (UniqueName: \"kubernetes.io/projected/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-kube-api-access-5pq28\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.385556 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-utilities\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.385693 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.885678426 +0000 UTC m=+116.600731814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.386061 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-catalog-content\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.386494 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-utilities\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.404858 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pq28\" (UniqueName: \"kubernetes.io/projected/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-kube-api-access-5pq28\") pod \"redhat-marketplace-r5nfg\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.450847 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4prcm" event={"ID":"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30","Type":"ContainerStarted","Data":"6ab51236149efcc07f211b563925c6ec75a4a578b0dad44b1417636288934f6e"} Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.451848 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerStarted","Data":"535f755bed49471791a7b9d9aeb5a07fe6c6dc8b4a7737ccf916d7b91cc604ec"} Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.452773 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7gdb" event={"ID":"a4bb15af-cc88-4019-a8b5-7e1670842bc3","Type":"ContainerStarted","Data":"a183f9dd7d7b6a0e75555e099379d2be077eefb370f1a5299cee140805cea6a5"} Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.455280 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerStarted","Data":"668515753e2e10ec64751921c4078775696dbe146a8eb0c060c0d9ea71755de6"} Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.456786 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.460150 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59" event={"ID":"6f2291f3-fb1c-4d23-9f78-59ef302b5c02","Type":"ContainerDied","Data":"e85ec4582f8c3a02caff38aff87ef5cc3f1013bdeddb7f3241e284cdfc8c8e74"} Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.460173 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e85ec4582f8c3a02caff38aff87ef5cc3f1013bdeddb7f3241e284cdfc8c8e74" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.486492 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.486948 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:04.986919049 +0000 UTC m=+116.701972437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.492194 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:04 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:04 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:04 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.492230 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.576869 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.587786 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.588139 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.08812345 +0000 UTC m=+116.803176828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.588517 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.588800 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.08879291 +0000 UTC m=+116.803846298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.600119 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qqb7g"] Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.601534 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.648763 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqb7g"] Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.689455 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.689793 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf65r\" (UniqueName: \"kubernetes.io/projected/8779507e-e1f3-45fe-999f-69d9ea563140-kube-api-access-kf65r\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.689843 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-catalog-content\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.689892 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-utilities\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.690067 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.190033382 +0000 UTC m=+116.905086770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.791825 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-catalog-content\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.791900 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-utilities\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.792003 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.792100 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf65r\" (UniqueName: \"kubernetes.io/projected/8779507e-e1f3-45fe-999f-69d9ea563140-kube-api-access-kf65r\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.792460 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-catalog-content\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.792674 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-utilities\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.792764 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.292750597 +0000 UTC m=+117.007803985 (durationBeforeRetry 500ms). 
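The unmount/mount failures above, and the Error: detail that resumes just below, all reduce to a single condition: kubelet cannot find kubevirt.io.hostpath-provisioner in its list of registered CSI plugins. The driver's node plugin has not yet registered over the kubelet plugin-registration socket, so every MountVolume.MountDevice attempt for image-registry-697d97f7c8-nbjqb and every UnmountVolume.TearDown attempt for pod UID 8f668bae-612b-4b75-9490-919e737c6a3b fails and is requeued. A minimal diagnostic sketch (not part of this log or of kubelet), assuming client-go and a kubeconfig in the default home location, would check both the cluster-scoped CSIDriver objects and the per-node CSINode registration for the node "crc" named in this log:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes a kubeconfig at the default home path.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        // Cluster-scoped CSIDriver objects, created by the driver's deployment.
        drivers, err := cs.StorageV1().CSIDrivers().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range drivers.Items {
            fmt.Println("CSIDriver object:", d.Name)
        }

        // Per-node registrations: a driver appears here only after its node
        // plugin has registered with the kubelet, which is exactly what the
        // "not found in the list of registered CSI drivers" errors are waiting on.
        csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range csiNode.Spec.Drivers {
            fmt.Println("registered on node crc:", d.Name)
        }
    }

If the driver shows up in the CSIDriver list but not in the CSINode entry, the retries below are expected to keep failing until the node plugin finishes registering. The log now resumes with the matching Error: detail.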
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.822068 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf65r\" (UniqueName: \"kubernetes.io/projected/8779507e-e1f3-45fe-999f-69d9ea563140-kube-api-access-kf65r\") pod \"redhat-marketplace-qqb7g\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.892951 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.893322 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.393308271 +0000 UTC m=+117.108361659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.930823 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.940185 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.995735 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bf1e070-ebdd-4793-b691-abb48e4e426b-kube-api-access\") pod \"6bf1e070-ebdd-4793-b691-abb48e4e426b\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.995957 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bf1e070-ebdd-4793-b691-abb48e4e426b-kubelet-dir\") pod \"6bf1e070-ebdd-4793-b691-abb48e4e426b\" (UID: \"6bf1e070-ebdd-4793-b691-abb48e4e426b\") " Oct 06 08:43:04 crc kubenswrapper[4610]: I1006 08:43:04.996437 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:04 crc kubenswrapper[4610]: E1006 08:43:04.996927 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.496911371 +0000 UTC m=+117.211964759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.000786 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf1e070-ebdd-4793-b691-abb48e4e426b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6bf1e070-ebdd-4793-b691-abb48e4e426b" (UID: "6bf1e070-ebdd-4793-b691-abb48e4e426b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.000830 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bf1e070-ebdd-4793-b691-abb48e4e426b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6bf1e070-ebdd-4793-b691-abb48e4e426b" (UID: "6bf1e070-ebdd-4793-b691-abb48e4e426b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.098606 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.099400 4610 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bf1e070-ebdd-4793-b691-abb48e4e426b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.099421 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bf1e070-ebdd-4793-b691-abb48e4e426b-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.099604 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.599584965 +0000 UTC m=+117.314638353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.201544 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.201939 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.701921399 +0000 UTC m=+117.416974787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.292154 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5nfg"] Oct 06 08:43:05 crc kubenswrapper[4610]: W1006 08:43:05.303658 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcf6d26_e4c1_45a2_bdf7_f6c5c8f9461e.slice/crio-77d6691cf38111f1c5eace19d93a46e57f92d15091f0ce29ccd2162509b990ec WatchSource:0}: Error finding container 77d6691cf38111f1c5eace19d93a46e57f92d15091f0ce29ccd2162509b990ec: Status 404 returned error can't find the container with id 77d6691cf38111f1c5eace19d93a46e57f92d15091f0ce29ccd2162509b990ec Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.304366 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.304740 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.804726976 +0000 UTC m=+117.519780364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.353676 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqb7g"] Oct 06 08:43:05 crc kubenswrapper[4610]: W1006 08:43:05.361523 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8779507e_e1f3_45fe_999f_69d9ea563140.slice/crio-c281f036325255843fc6e813b3a246085e664078408f60eba819dcf369b27876 WatchSource:0}: Error finding container c281f036325255843fc6e813b3a246085e664078408f60eba819dcf369b27876: Status 404 returned error can't find the container with id c281f036325255843fc6e813b3a246085e664078408f60eba819dcf369b27876 Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.388761 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2sz4l"] Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.389225 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf1e070-ebdd-4793-b691-abb48e4e426b" containerName="pruner" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.389313 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf1e070-ebdd-4793-b691-abb48e4e426b" containerName="pruner" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.389502 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf1e070-ebdd-4793-b691-abb48e4e426b" containerName="pruner" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.390410 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.408349 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.409179 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sz4l"] Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.409812 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.410406 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:05.910391965 +0000 UTC m=+117.625445353 (durationBeforeRetry 500ms). 
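Each of these nestedpendingoperations entries is kubelet's operation executor gating a retry: the operation is marked failed and no new attempt is permitted before the timestamp given (the m=+117... figures are monotonic seconds since kubelet start). The 500ms durationBeforeRetry seen throughout is the initial delay of the per-volume backoff; kubelet may lengthen it on repeated failures. A minimal sketch of the same gating idea, using the apimachinery wait helpers rather than kubelet's actual nestedpendingoperations code (the stand-in condition and the parameters are assumptions chosen to mirror the log):

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        attempt := 0
        // Initial 500ms delay, doubling per failure; the factor is an
        // assumption, only the 500ms initial delay is visible in the log.
        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 6}
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            fmt.Println("mount attempt", attempt)
            // Stand-in for MountDevice: keeps failing while the CSI driver
            // is unregistered, then succeeds once it appears.
            if attempt < 4 {
                return false, nil // retry after the next backoff delay
            }
            return true, nil
        })
        if err != nil {
            fmt.Println("gave up:", err)
        }
    }

The Error: line that follows is the detail for the retry gated just above.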
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.468037 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerStarted","Data":"a64a4f970a0bad7ea6ac834b621fa00ed867facbcdfd0530a24d7595d73cb69d"} Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.469003 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqb7g" event={"ID":"8779507e-e1f3-45fe-999f-69d9ea563140","Type":"ContainerStarted","Data":"c281f036325255843fc6e813b3a246085e664078408f60eba819dcf369b27876"} Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.471806 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.475899 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6bf1e070-ebdd-4793-b691-abb48e4e426b","Type":"ContainerDied","Data":"0af8857b6ff58e52affe91a8db9cf9b2aa808188bc33f668427dd5823e9cb9ab"} Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.475950 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af8857b6ff58e52affe91a8db9cf9b2aa808188bc33f668427dd5823e9cb9ab" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.492256 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:05 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:05 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:05 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.492309 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.492461 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerStarted","Data":"e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2"} Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.494202 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5nfg" event={"ID":"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e","Type":"ContainerStarted","Data":"77d6691cf38111f1c5eace19d93a46e57f92d15091f0ce29ccd2162509b990ec"} Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.511034 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.511314 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.011284298 +0000 UTC m=+117.726337686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.511374 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.511489 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-utilities\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.511716 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5m5g\" (UniqueName: \"kubernetes.io/projected/fadb3764-e589-451f-8337-1c4a9bb988af-kube-api-access-r5m5g\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.511769 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.011758012 +0000 UTC m=+117.726811400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.511789 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-catalog-content\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.613560 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.613587 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.113570091 +0000 UTC m=+117.828623479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.614083 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-utilities\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.614280 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5m5g\" (UniqueName: \"kubernetes.io/projected/fadb3764-e589-451f-8337-1c4a9bb988af-kube-api-access-r5m5g\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.614328 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-catalog-content\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.614495 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.614815 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-utilities\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.614840 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-catalog-content\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.614930 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.114907289 +0000 UTC m=+117.829960837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.635389 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5m5g\" (UniqueName: \"kubernetes.io/projected/fadb3764-e589-451f-8337-1c4a9bb988af-kube-api-access-r5m5g\") pod \"redhat-operators-2sz4l\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.716121 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.716588 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.216569934 +0000 UTC m=+117.931623342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.725925 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.797758 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5d4ht"] Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.798961 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.808822 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5d4ht"] Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.817460 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.817814 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.317800577 +0000 UTC m=+118.032853965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.918472 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.918629 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.418597647 +0000 UTC m=+118.133651045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.918915 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-catalog-content\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.918961 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk9fm\" (UniqueName: \"kubernetes.io/projected/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-kube-api-access-wk9fm\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.918982 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-utilities\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:05 crc kubenswrapper[4610]: I1006 08:43:05.919006 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:05 crc kubenswrapper[4610]: E1006 08:43:05.919356 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.419330428 +0000 UTC m=+118.134383816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.019578 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.020191 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.520165019 +0000 UTC m=+118.235218407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.020334 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-catalog-content\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.020391 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk9fm\" (UniqueName: \"kubernetes.io/projected/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-kube-api-access-wk9fm\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.020423 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-utilities\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.020449 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.020796 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-catalog-content\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.020853 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.520834788 +0000 UTC m=+118.235888176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.020990 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-utilities\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.043356 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk9fm\" (UniqueName: \"kubernetes.io/projected/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-kube-api-access-wk9fm\") pod \"redhat-operators-5d4ht\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.067841 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sz4l"] Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.118128 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.123870 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.123997 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.623978255 +0000 UTC m=+118.339031643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.124438 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.124758 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.624748377 +0000 UTC m=+118.339801765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.226558 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.226788 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.726748632 +0000 UTC m=+118.441802020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.226898 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.227346 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.727335458 +0000 UTC m=+118.442389066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.332397 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.332860 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.832845173 +0000 UTC m=+118.547898561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.402100 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5d4ht"] Oct 06 08:43:06 crc kubenswrapper[4610]: W1006 08:43:06.410382 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb17a66b_9ac1_4751_941a_5d8a851c2f3a.slice/crio-d8efe993d4aacd115ca1782e4b952f8fbdd90dffa7a0d479a76309cffc0e845d WatchSource:0}: Error finding container d8efe993d4aacd115ca1782e4b952f8fbdd90dffa7a0d479a76309cffc0e845d: Status 404 returned error can't find the container with id d8efe993d4aacd115ca1782e4b952f8fbdd90dffa7a0d479a76309cffc0e845d Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.434294 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.434601 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:06.93458865 +0000 UTC m=+118.649642038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.492679 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:06 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:06 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:06 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.492900 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.501363 4610 generic.go:334] "Generic (PLEG): container finished" podID="b812be8d-d295-4878-9af8-c0387a655dbc" containerID="a64a4f970a0bad7ea6ac834b621fa00ed867facbcdfd0530a24d7595d73cb69d" exitCode=0 Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.501423 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerDied","Data":"a64a4f970a0bad7ea6ac834b621fa00ed867facbcdfd0530a24d7595d73cb69d"} Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.503843 4610 generic.go:334] "Generic (PLEG): container finished" podID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerID="8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf" exitCode=0 Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.503899 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4prcm" event={"ID":"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30","Type":"ContainerDied","Data":"8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf"} Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.505458 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerID="e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2" exitCode=0 Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.505506 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerDied","Data":"e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2"} Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.507175 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerStarted","Data":"b8b05038f8da9907cc6f13254ab0051578b79c619cd8a7ebd1b8fd3ad36b738d"} Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.507579 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 
08:43:06.509057 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerStarted","Data":"d8efe993d4aacd115ca1782e4b952f8fbdd90dffa7a0d479a76309cffc0e845d"} Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.511677 4610 generic.go:334] "Generic (PLEG): container finished" podID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerID="56167533afad85023ce9c48edf5810434312e17fd99f2ac8918dc6b2e7a2659c" exitCode=0 Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.511701 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7gdb" event={"ID":"a4bb15af-cc88-4019-a8b5-7e1670842bc3","Type":"ContainerDied","Data":"56167533afad85023ce9c48edf5810434312e17fd99f2ac8918dc6b2e7a2659c"} Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.534780 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.534967 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.034946848 +0000 UTC m=+118.750000246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.535235 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.535524 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.035513154 +0000 UTC m=+118.750566542 (durationBeforeRetry 500ms). 
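The Generic (PLEG) entries above are kubelet's pod-lifecycle event generator noticing container exits: the marketplace catalog pods (certified-operators-h8dz4, community-operators-4prcm, community-operators-84vcq, certified-operators-p7gdb) each ran a content-extraction container that finished with exitCode=0, so the ContainerDied events here mark normal completion rather than failure. The same information is visible from the API side; a hypothetical check in the style of the earlier sketch (same kubeconfig assumption, pod and namespace names taken from the log):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        pod, err := cs.CoreV1().Pods("openshift-marketplace").Get(
            context.TODO(), "certified-operators-h8dz4", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        // Init and regular containers both report a Terminated state carrying
        // the exit code that PLEG logged.
        statuses := append(pod.Status.InitContainerStatuses, pod.Status.ContainerStatuses...)
        for _, st := range statuses {
            if t := st.State.Terminated; t != nil {
                fmt.Printf("%s exited with code %d\n", st.Name, t.ExitCode)
            }
        }
    }

The requeued volume operations resume below with the matching Error: detail.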
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.636553 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.636835 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.136809718 +0000 UTC m=+118.851863126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.636874 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.637988 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.137899489 +0000 UTC m=+118.852953037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.737924 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.738365 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.23834514 +0000 UTC m=+118.953398528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.839556 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.839948 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.339930892 +0000 UTC m=+119.054984270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.941289 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.941518 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.441479814 +0000 UTC m=+119.156533202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:06 crc kubenswrapper[4610]: I1006 08:43:06.941684 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:06 crc kubenswrapper[4610]: E1006 08:43:06.942134 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.442125052 +0000 UTC m=+119.157178440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.042564 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.042754 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.542729927 +0000 UTC m=+119.257783305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.042920 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.043228 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.543220371 +0000 UTC m=+119.258273759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.143679 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.143857 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.643834596 +0000 UTC m=+119.358887994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.143938 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.144371 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.64431883 +0000 UTC m=+119.359372228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.244652 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.244783 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.74475643 +0000 UTC m=+119.459809828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.245185 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.245539 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.745527992 +0000 UTC m=+119.460581620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.346600 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.346818 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.846786686 +0000 UTC m=+119.561840074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.346956 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.347365 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.847356942 +0000 UTC m=+119.562410330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.447904 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.448034 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.948013588 +0000 UTC m=+119.663066996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.448477 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.448770 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:07.948760939 +0000 UTC m=+119.663814327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.492921 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:07 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:07 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:07 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.492983 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.527611 4610 generic.go:334] "Generic (PLEG): container finished" podID="fadb3764-e589-451f-8337-1c4a9bb988af" containerID="ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857" exitCode=0 Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.527679 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerDied","Data":"ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857"} Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.529558 4610 generic.go:334] "Generic (PLEG): container finished" podID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerID="810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e" exitCode=0 Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.529606 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerDied","Data":"810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e"} Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.531079 4610 generic.go:334] "Generic (PLEG): container finished" podID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerID="cbd5ddb32aecbfb1839030f532732c496ef52764bfa5419a3138655f181f6ced" exitCode=0 Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.531151 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5nfg" event={"ID":"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e","Type":"ContainerDied","Data":"cbd5ddb32aecbfb1839030f532732c496ef52764bfa5419a3138655f181f6ced"} Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.532845 4610 generic.go:334] "Generic (PLEG): container finished" podID="8779507e-e1f3-45fe-999f-69d9ea563140" containerID="9c1aa35c15e64c96cd2664ed266a6bf27e56c0b77de321e745b3f5b42e3d9a38" exitCode=0 Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.532968 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqb7g" 
event={"ID":"8779507e-e1f3-45fe-999f-69d9ea563140","Type":"ContainerDied","Data":"9c1aa35c15e64c96cd2664ed266a6bf27e56c0b77de321e745b3f5b42e3d9a38"} Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.551817 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.551999 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.051974358 +0000 UTC m=+119.767027736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.552502 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.553085 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.053062459 +0000 UTC m=+119.768115857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.654779 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.655064 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.155018622 +0000 UTC m=+119.870072010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.655427 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.656585 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.156576926 +0000 UTC m=+119.871630314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.707722 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.707745 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.707789 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.707793 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.756441 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: 
E1006 08:43:07.756627 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.256599494 +0000 UTC m=+119.971652882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.756860 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.757227 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.257178771 +0000 UTC m=+119.972232169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.857984 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.858265 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.358233388 +0000 UTC m=+120.073286776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.858373 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.858901 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.358889697 +0000 UTC m=+120.073943085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.877206 4610 patch_prober.go:28] interesting pod/console-f9d7485db-8p28v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.877247 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8p28v" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 06 08:43:07 crc kubenswrapper[4610]: I1006 08:43:07.959695 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:07 crc kubenswrapper[4610]: E1006 08:43:07.960009 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.459995896 +0000 UTC m=+120.175049284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.028890 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.061164 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.061487 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.561472806 +0000 UTC m=+120.276526194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.111082 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.163377 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.164589 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.664557261 +0000 UTC m=+120.379610759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.166973 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7z998" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.264706 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.266452 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.766419122 +0000 UTC m=+120.481472690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.366356 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.366500 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.866469611 +0000 UTC m=+120.581522999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.366929 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.368434 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.868421476 +0000 UTC m=+120.583474864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.447264 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.468323 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.468712 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:08.968699432 +0000 UTC m=+120.683752820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.502859 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-27r27" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.504292 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:08 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:08 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:08 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.504345 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.536685 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m587k" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.570992 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.571616 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.071602522 +0000 UTC m=+120.786655910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.671787 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.671915 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.171893258 +0000 UTC m=+120.886946646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.672206 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.676146 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.176130349 +0000 UTC m=+120.891183737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.773452 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.773935 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.273918613 +0000 UTC m=+120.988972001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.875688 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.876150 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.376110843 +0000 UTC m=+121.091164231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.933242 4610 patch_prober.go:28] interesting pod/apiserver-76f77b778f-kxqjk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]log ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]etcd ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 08:43:08 crc kubenswrapper[4610]: [-]poststarthook/max-in-flight-filter failed: reason withheld Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 06 08:43:08 crc kubenswrapper[4610]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 06 08:43:08 crc kubenswrapper[4610]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/project.openshift.io-projectcache ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 06 08:43:08 crc kubenswrapper[4610]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 06 08:43:08 crc kubenswrapper[4610]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 08:43:08 crc kubenswrapper[4610]: livez check failed Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.933367 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" podUID="b549d43d-f011-4c3b-9fd6-b3af936f56ed" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.977336 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.977542 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.477522411 +0000 UTC m=+121.192575799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:08 crc kubenswrapper[4610]: I1006 08:43:08.977646 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:08 crc kubenswrapper[4610]: E1006 08:43:08.978035 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.478026705 +0000 UTC m=+121.193080093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.081799 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.081962 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.581937004 +0000 UTC m=+121.296990402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.082722 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.083077 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.583064386 +0000 UTC m=+121.298117774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.183598 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.183736 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.683716122 +0000 UTC m=+121.398769510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.183894 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.184177 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.684169405 +0000 UTC m=+121.399222793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.285261 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.285431 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.785404298 +0000 UTC m=+121.500457686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.285692 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.285998 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.785985865 +0000 UTC m=+121.501039253 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.387370 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.387948 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.887933768 +0000 UTC m=+121.602987156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.489437 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.491370 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:09.991356773 +0000 UTC m=+121.706410161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.491562 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:09 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:09 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:09 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.491607 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.563411 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" event={"ID":"f8b408c0-509e-4ff0-9688-7d142ec0a14e","Type":"ContainerStarted","Data":"c7f1dcb8e6495beeeb0ea15f97c5e374629be38021e4dfe6a8076c27f642c52a"} Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.563459 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" event={"ID":"f8b408c0-509e-4ff0-9688-7d142ec0a14e","Type":"ContainerStarted","Data":"a196dbaf4827a288b9547d05d5f5af2daf4af56fea4be068e67eab542232e4b4"} Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.590983 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
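
The repeating MountVolume/UnmountVolume failures above all trace to a single condition: the kubelet's volume plugin manager has no entry yet for kubevirt.io.hostpath-provisioner, so every mount and unmount attempt fails fast and is requeued with a fixed 500ms durationBeforeRetry. Registration state can be inspected from outside the kubelet through the node's CSINode object. A minimal client-go sketch, assuming a reachable kubeconfig at /root/.kube/config and the node name crc (both assumptions taken from this environment, not from the log itself):

package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from a local kubeconfig (path is an assumption).
	config, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		log.Fatal(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		log.Fatal(err)
	}
	// CSINode lists the drivers that have completed node registration.
	// An empty list here corresponds to the "not found in the list of
	// registered CSI drivers" errors repeating above.
	csiNode, err := clientset.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered on node:", d.Name)
	}
}

Until the driver shows up in that list, the retry loop below simply keeps re-queuing the same two operations every 500ms.
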
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.591210 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.091176355 +0000 UTC m=+121.806229733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.591781 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.592100 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.092092561 +0000 UTC m=+121.807145949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.602140 4610 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.692609 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.693115 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.193098847 +0000 UTC m=+121.908152235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.794680 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.795017 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.295006159 +0000 UTC m=+122.010059547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.895264 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.395240594 +0000 UTC m=+122.110293982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.896026 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.896417 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.896724 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.396711916 +0000 UTC m=+122.111765304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.997683 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.998007 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.497986849 +0000 UTC m=+122.213040237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:09 crc kubenswrapper[4610]: I1006 08:43:09.998336 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:09 crc kubenswrapper[4610]: E1006 08:43:09.998650 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.498641948 +0000 UTC m=+122.213695326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.099502 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:10 crc kubenswrapper[4610]: E1006 08:43:10.100103 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.600065746 +0000 UTC m=+122.315119134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.109209 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:10 crc kubenswrapper[4610]: E1006 08:43:10.109667 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:43:10.609646569 +0000 UTC m=+122.324699957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbjqb" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.205816 4610 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T08:43:09.602159248Z","Handler":null,"Name":""} Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.209375 4610 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.209415 4610 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.209787 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.215255 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
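
The sequence above is the fix arriving: plugin_watcher notices the registration socket appear under /var/lib/kubelet/plugins_registry, csi_plugin validates the driver over that socket and registers it, and the TearDown that had been failing for two minutes immediately succeeds. The watcher half of that handshake is essentially filesystem notification on the registry directory. A rough sketch of the same pattern using fsnotify, as an illustrative stand-in rather than the kubelet's actual plugin_watcher code:

package main

import (
	"log"

	"github.com/fsnotify/fsnotify"
)

func main() {
	// Watch the kubelet plugin-registration directory. The kubelet's own
	// plugin_watcher does roughly this, then dials each new socket and
	// validates the driver over the plugin-registration gRPC API before
	// adding it to the registered-drivers list.
	w, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()
	if err := w.Add("/var/lib/kubelet/plugins_registry"); err != nil {
		log.Fatal(err)
	}
	for {
		select {
		case ev := <-w.Events:
			if ev.Op&fsnotify.Create != 0 {
				log.Printf("new plugin socket: %s", ev.Name)
			}
		case err := <-w.Errors:
			log.Printf("watch error: %v", err)
		}
	}
}

Note also the csi_attacher line just below: the driver does not advertise STAGE_UNSTAGE_VOLUME, so MountDevice is a no-op and the volume goes straight to SetUp.
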
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.311377 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.347971 4610 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.348010 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.383553 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbjqb\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.491212 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:10 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:10 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:10 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.491268 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.542953 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.550304 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:10 crc kubenswrapper[4610]: I1006 08:43:10.577480 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" event={"ID":"f8b408c0-509e-4ff0-9688-7d142ec0a14e","Type":"ContainerStarted","Data":"e80c14b0da556e6e41b0a78ef3e2e77d6ebfd33c49a91ef1fb24d5c18c8727ad"} Oct 06 08:43:11 crc kubenswrapper[4610]: I1006 08:43:11.118952 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 06 08:43:11 crc kubenswrapper[4610]: I1006 08:43:11.133267 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kjgjr" podStartSLOduration=26.133248967 podStartE2EDuration="26.133248967s" podCreationTimestamp="2025-10-06 08:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:43:10.604725927 +0000 UTC m=+122.319779315" watchObservedRunningTime="2025-10-06 08:43:11.133248967 +0000 UTC m=+122.848302355" Oct 06 08:43:11 crc kubenswrapper[4610]: I1006 08:43:11.136248 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbjqb"] Oct 06 08:43:11 crc kubenswrapper[4610]: I1006 08:43:11.492015 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:11 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:11 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:11 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:11 crc kubenswrapper[4610]: I1006 08:43:11.492097 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:11 crc kubenswrapper[4610]: I1006 08:43:11.593133 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" event={"ID":"be459a27-8ce8-4825-9b01-a89a33fb81d6","Type":"ContainerStarted","Data":"7edb562d8f185ea157088ab8cb21876b00e92b0ba1b09c5696bbd41a9a79d655"} Oct 06 08:43:12 crc kubenswrapper[4610]: I1006 08:43:12.491479 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:12 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:12 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:12 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:12 crc kubenswrapper[4610]: I1006 08:43:12.491565 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:12 crc kubenswrapper[4610]: I1006 08:43:12.603197 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" event={"ID":"be459a27-8ce8-4825-9b01-a89a33fb81d6","Type":"ContainerStarted","Data":"df59d0bedd459dda472f5003b0de7109316e71041337dff61c465b92cfe12df9"}
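
The pod_startup_latency_tracker entries above and below ("Observed pod startup duration", podStartSLOduration / podStartE2EDuration) record creation-to-running time per pod: csi-hostpathplugin-kjgjr took 26s, while image-registry-697d97f7c8-nbjqb below lands at 1m38s because of the two minutes of mount retries. Roughly the same number can be derived from the API by comparing a pod's CreationTimestamp with its Ready condition transition. A minimal client-go sketch, under the same kubeconfig-path assumption as earlier (Ready transition is a close proxy for, not identical to, the kubelet's observedRunningTime):

package main

import (
	"context"
	"fmt"
	"log"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(config)
	if err != nil {
		log.Fatal(err)
	}
	pod, err := cs.CoreV1().Pods("openshift-image-registry").Get(
		context.TODO(), "image-registry-697d97f7c8-nbjqb", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	// Find the Ready condition and measure how long after creation it flipped.
	for _, c := range pod.Status.Conditions {
		if c.Type == corev1.PodReady && c.Status == corev1.ConditionTrue {
			fmt.Println("creation -> ready:", c.LastTransitionTime.Sub(pod.CreationTimestamp.Time))
		}
	}
}
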
pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" event={"ID":"be459a27-8ce8-4825-9b01-a89a33fb81d6","Type":"ContainerStarted","Data":"df59d0bedd459dda472f5003b0de7109316e71041337dff61c465b92cfe12df9"} Oct 06 08:43:12 crc kubenswrapper[4610]: I1006 08:43:12.603534 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:12 crc kubenswrapper[4610]: I1006 08:43:12.626434 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" podStartSLOduration=98.626418065 podStartE2EDuration="1m38.626418065s" podCreationTimestamp="2025-10-06 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:43:12.618637773 +0000 UTC m=+124.333691161" watchObservedRunningTime="2025-10-06 08:43:12.626418065 +0000 UTC m=+124.341471453" Oct 06 08:43:13 crc kubenswrapper[4610]: I1006 08:43:13.076997 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:43:13 crc kubenswrapper[4610]: I1006 08:43:13.082395 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kxqjk" Oct 06 08:43:13 crc kubenswrapper[4610]: I1006 08:43:13.490401 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:13 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:13 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:13 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:13 crc kubenswrapper[4610]: I1006 08:43:13.490462 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:14 crc kubenswrapper[4610]: I1006 08:43:14.490558 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:14 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:14 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:14 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:14 crc kubenswrapper[4610]: I1006 08:43:14.490651 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:15 crc kubenswrapper[4610]: I1006 08:43:15.491573 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:15 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:15 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:15 crc kubenswrapper[4610]: healthz check 
failed Oct 06 08:43:15 crc kubenswrapper[4610]: I1006 08:43:15.491699 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:16 crc kubenswrapper[4610]: I1006 08:43:16.491414 4610 patch_prober.go:28] interesting pod/router-default-5444994796-99z72 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:43:16 crc kubenswrapper[4610]: [-]has-synced failed: reason withheld Oct 06 08:43:16 crc kubenswrapper[4610]: [+]process-running ok Oct 06 08:43:16 crc kubenswrapper[4610]: healthz check failed Oct 06 08:43:16 crc kubenswrapper[4610]: I1006 08:43:16.491464 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-99z72" podUID="0b04ea21-e24d-4d1c-861e-28746c304f7d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.491317 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.499139 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-99z72" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.708446 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.708499 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.708525 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.708565 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.708581 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.708998 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"f93a61903bd2ad8a310857d778c3a9bc0e8120961c34c9f70d3810f07b4874fa"} pod="openshift-console/downloads-7954f5f757-klkbs" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 06 08:43:17 crc 
kubenswrapper[4610]: I1006 08:43:17.709102 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" containerID="cri-o://f93a61903bd2ad8a310857d778c3a9bc0e8120961c34c9f70d3810f07b4874fa" gracePeriod=2 Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.710382 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.710406 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.876725 4610 patch_prober.go:28] interesting pod/console-f9d7485db-8p28v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 06 08:43:17 crc kubenswrapper[4610]: I1006 08:43:17.876789 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8p28v" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 06 08:43:19 crc kubenswrapper[4610]: I1006 08:43:19.663081 4610 generic.go:334] "Generic (PLEG): container finished" podID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerID="f93a61903bd2ad8a310857d778c3a9bc0e8120961c34c9f70d3810f07b4874fa" exitCode=0 Oct 06 08:43:19 crc kubenswrapper[4610]: I1006 08:43:19.663342 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-klkbs" event={"ID":"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a","Type":"ContainerDied","Data":"f93a61903bd2ad8a310857d778c3a9bc0e8120961c34c9f70d3810f07b4874fa"} Oct 06 08:43:27 crc kubenswrapper[4610]: I1006 08:43:27.708422 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:27 crc kubenswrapper[4610]: I1006 08:43:27.709291 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:27 crc kubenswrapper[4610]: I1006 08:43:27.880562 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:43:27 crc kubenswrapper[4610]: I1006 08:43:27.884554 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8p28v" Oct 06 08:43:28 crc kubenswrapper[4610]: I1006 08:43:28.503887 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7"
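
The downloads-7954f5f757-klkbs entries above show a complete liveness cycle: the HTTP probe gets connection refused, the kubelet marks the container unhealthy, kills it with gracePeriod=2, and PLEG later reports ContainerDied with exitCode=0. The probe itself is a plain HTTP GET in which any status in [200,400) counts as success; connection errors and 5xx responses both fail it. A self-contained sketch of that check (the URL is the pod endpoint from the log and only resolves on-cluster; the 1-second timeout is an assumption, not the pod's configured timeoutSeconds):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe mimics the shape of the kubelet's HTTP probe: GET the endpoint
// and treat any status code in [200, 400) as success.
func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as logged above
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	if err := probe("http://10.217.0.29:8080/"); err != nil {
		fmt.Println("probe failed:", err)
	}
}

A failed readiness probe only removes the pod from endpoints; it is the liveness failure that triggers the kill-and-restart seen here.
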
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jxcr7" Oct 06 08:43:30 crc kubenswrapper[4610]: I1006 08:43:30.556813 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.094860 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.095394 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.095463 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.095530 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.097246 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.097500 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.099062 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.107190 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.113765 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.119148 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.119143 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.166285 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.182003 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.189586 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.385619 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.709915 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:37 crc kubenswrapper[4610]: I1006 08:43:37.709972 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:46 crc kubenswrapper[4610]: I1006 08:43:46.468961 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:43:46 crc kubenswrapper[4610]: I1006 08:43:46.469285 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:43:47 crc kubenswrapper[4610]: I1006 08:43:47.707344 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:47 crc kubenswrapper[4610]: I1006 08:43:47.707684 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" 
podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:43:57 crc kubenswrapper[4610]: I1006 08:43:57.707576 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:43:57 crc kubenswrapper[4610]: I1006 08:43:57.708131 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:04 crc kubenswrapper[4610]: E1006 08:44:04.814019 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 08:44:04 crc kubenswrapper[4610]: E1006 08:44:04.815251 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kf65r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qqb7g_openshift-marketplace(8779507e-e1f3-45fe-999f-69d9ea563140): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\": context canceled" logger="UnhandledError" 
Oct 06 08:44:04 crc kubenswrapper[4610]: E1006 08:44:04.816647 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:afa798da1eea334bab3cb1e14451ff84f98d35b436cdc4b408b46e289e4e2bc2\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-qqb7g" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.227163 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.227350 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-922nv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4prcm_openshift-marketplace(d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.228585 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4prcm" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.334533 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-qqb7g" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.462307 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.469807 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jp2jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-84vcq_openshift-marketplace(8c2dfe74-d5c9-4602-a697-cc40064871b9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:05 crc kubenswrapper[4610]: E1006 08:44:05.471002 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-84vcq" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" Oct 06 08:44:07 crc kubenswrapper[4610]: I1006 08:44:07.707415 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:07 crc kubenswrapper[4610]: I1006 08:44:07.707759 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:08 crc kubenswrapper[4610]: E1006 08:44:08.945647 4610 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-84vcq" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" Oct 06 08:44:09 crc kubenswrapper[4610]: E1006 08:44:09.026657 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 06 08:44:09 crc kubenswrapper[4610]: E1006 08:44:09.026813 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5m5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2sz4l_openshift-marketplace(fadb3764-e589-451f-8337-1c4a9bb988af): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:09 crc kubenswrapper[4610]: E1006 08:44:09.028068 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2sz4l" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.440497 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2sz4l" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.657237 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.657422 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r9rhd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p7gdb_openshift-marketplace(a4bb15af-cc88-4019-a8b5-7e1670842bc3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.658696 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p7gdb" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.808550 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.808704 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wk9fm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5d4ht_openshift-marketplace(cb17a66b-9ac1-4751-941a-5d8a851c2f3a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:11 crc kubenswrapper[4610]: E1006 08:44:11.809843 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5d4ht" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" Oct 06 08:44:16 crc kubenswrapper[4610]: I1006 08:44:16.469134 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:44:16 crc kubenswrapper[4610]: I1006 08:44:16.469647 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:44:16 crc kubenswrapper[4610]: E1006 08:44:16.802712 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5d4ht" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" Oct 06 08:44:16 crc kubenswrapper[4610]: E1006 08:44:16.842487 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 08:44:16 crc kubenswrapper[4610]: E1006 08:44:16.842657 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b76tl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h8dz4_openshift-marketplace(b812be8d-d295-4878-9af8-c0387a655dbc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:16 crc kubenswrapper[4610]: E1006 08:44:16.843954 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h8dz4" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" Oct 06 08:44:17 crc kubenswrapper[4610]: I1006 08:44:17.707718 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:17 crc kubenswrapper[4610]: I1006 08:44:17.707769 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:17 crc kubenswrapper[4610]: E1006 08:44:17.988187 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h8dz4" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" Oct 06 08:44:18 crc kubenswrapper[4610]: E1006 08:44:18.106216 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 08:44:18 crc kubenswrapper[4610]: E1006 08:44:18.106552 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5pq28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-r5nfg_openshift-marketplace(cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:44:18 crc kubenswrapper[4610]: E1006 08:44:18.107717 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-r5nfg" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" Oct 06 08:44:18 crc kubenswrapper[4610]: W1006 08:44:18.499229 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-1bd250c4fa84b8118c010c15ad337fb0c619590871c972129b9861286118e93f WatchSource:0}: Error finding container 1bd250c4fa84b8118c010c15ad337fb0c619590871c972129b9861286118e93f: Status 404 returned error can't find the container with id 1bd250c4fa84b8118c010c15ad337fb0c619590871c972129b9861286118e93f Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.956748 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-klkbs" event={"ID":"0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a","Type":"ContainerStarted","Data":"ba22017efee7660c83f4ab6e2dcf963f541f8b80e2662c0e4033f9d351c10870"} Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.957299 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.957578 4610 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.957641 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.959189 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f70613dc5112f63d8c3f5dedccb1bce1d664eafa0d63d2db096d8eeca361b25d"} Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.959245 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b2f9670c2086378d41bc623dbc7e70a7083ab0c4e4a41dbf992c1df63474fe1a"} Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.960245 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.962620 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a68b03b5303601495d2c2cd375e13896cc0d898d128e6036d5605263d2d1a895"} Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.962655 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f58d1bc5d237a632945583e2112b866dbccd5c8541ef9f9d066497a46a6201fc"} Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.965345 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c2b3a81fff0d57139e2ce4922656c698398afe97a0fc869783f454ca69ad748e"} Oct 06 08:44:18 crc kubenswrapper[4610]: I1006 08:44:18.965366 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1bd250c4fa84b8118c010c15ad337fb0c619590871c972129b9861286118e93f"} Oct 06 08:44:18 crc kubenswrapper[4610]: E1006 08:44:18.967005 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-r5nfg" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" Oct 06 08:44:19 crc kubenswrapper[4610]: I1006 08:44:19.969649 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" 
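start-of-body=

The paired records above are the two halves of one probe: patch_prober logs the raw HTTP GET result and the adjacent prober.go record logs the verdict. "connect: connection refused" just means nothing is listening on 10.217.0.29:8080 yet. An HTTP probe succeeds when the GET returns a status below 400 within the timeout; a simplified sketch in that spirit (not kubelet's actual prober):

    import urllib.request
    import urllib.error

    def http_probe(url, timeout=1.0):
        """Return (ready, detail) for an HTTP readiness-style probe."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.status < 400, f"status={resp.status}"
        except urllib.error.HTTPError as err:
            return False, f"status={err.code}"
        except OSError as err:  # covers connection refused and timeouts
            return False, str(err)

    ok, detail = http_probe("http://10.217.0.29:8080/")
    print("ready" if ok else f"Probe failed: {detail}")
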
Oct 06 08:44:19 crc kubenswrapper[4610]: I1006 08:44:19.969713 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:21 crc kubenswrapper[4610]: I1006 08:44:21.980834 4610 generic.go:334] "Generic (PLEG): container finished" podID="8779507e-e1f3-45fe-999f-69d9ea563140" containerID="a2b32ccd29f894fd54c75e307d33e6226fabd717811c4bc72e4012d57abf9541" exitCode=0 Oct 06 08:44:21 crc kubenswrapper[4610]: I1006 08:44:21.980923 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqb7g" event={"ID":"8779507e-e1f3-45fe-999f-69d9ea563140","Type":"ContainerDied","Data":"a2b32ccd29f894fd54c75e307d33e6226fabd717811c4bc72e4012d57abf9541"} Oct 06 08:44:21 crc kubenswrapper[4610]: I1006 08:44:21.989024 4610 generic.go:334] "Generic (PLEG): container finished" podID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerID="7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4" exitCode=0 Oct 06 08:44:21 crc kubenswrapper[4610]: I1006 08:44:21.989281 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4prcm" event={"ID":"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30","Type":"ContainerDied","Data":"7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4"} Oct 06 08:44:22 crc kubenswrapper[4610]: I1006 08:44:22.996172 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4prcm" event={"ID":"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30","Type":"ContainerStarted","Data":"19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd"} Oct 06 08:44:22 crc kubenswrapper[4610]: I1006 08:44:22.999009 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqb7g" event={"ID":"8779507e-e1f3-45fe-999f-69d9ea563140","Type":"ContainerStarted","Data":"aa3b6eae34a9e7b167e3bf34b8babcf05eb06a38cf0148e5720f03b6ac3e0d18"} Oct 06 08:44:23 crc kubenswrapper[4610]: I1006 08:44:23.031526 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4prcm" podStartSLOduration=5.73884358 podStartE2EDuration="1m21.031504537s" podCreationTimestamp="2025-10-06 08:43:02 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.534063408 +0000 UTC m=+119.249116796" lastFinishedPulling="2025-10-06 08:44:22.826724365 +0000 UTC m=+194.541777753" observedRunningTime="2025-10-06 08:44:23.014945115 +0000 UTC m=+194.729998523" watchObservedRunningTime="2025-10-06 08:44:23.031504537 +0000 UTC m=+194.746557925" Oct 06 08:44:23 crc kubenswrapper[4610]: I1006 08:44:23.034440 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qqb7g" podStartSLOduration=3.862875021 podStartE2EDuration="1m19.03441879s" podCreationTimestamp="2025-10-06 08:43:04 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.534396848 +0000 UTC m=+119.249450236" lastFinishedPulling="2025-10-06 08:44:22.705940587 +0000 UTC m=+194.420994005" observedRunningTime="2025-10-06 08:44:23.031332922 +0000 UTC m=+194.746386320" watchObservedRunningTime="2025-10-06 08:44:23.03441879 +0000 UTC m=+194.749472198" Oct 06 08:44:24 crc kubenswrapper[4610]: I1006 08:44:24.940973 4610 kubelet.go:2542] "SyncLoop (probe)"
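probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qqb7g"

The pod_startup_latency_tracker records a few lines up carry enough data to re-derive kubelet's arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the time spent pulling images. Checking the community-operators-4prcm record using its monotonic m=+ offsets:

    # Values copied from the "Observed pod startup duration" record above.
    e2e = 81.031504537                     # podStartE2EDuration = "1m21.031504537s"
    first_started_pulling = 119.249116796  # m=+ offset of firstStartedPulling
    last_finished_pulling = 194.541777753  # m=+ offset of lastFinishedPulling

    pulling = last_finished_pulling - first_started_pulling
    slo = e2e - pulling                    # SLO duration excludes image pulls
    print(f"image pulling took {pulling:.9f}s")  # 75.292660957s
    print(f"podStartSLOduration = {slo:.8f}")    # 5.73884358, matching the log

So of the roughly 81s end-to-end startup, all but about 5.7s was spent pulling the catalog image.
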
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:44:24 crc kubenswrapper[4610]: I1006 08:44:24.941104 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:44:25 crc kubenswrapper[4610]: I1006 08:44:25.886652 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:44:27 crc kubenswrapper[4610]: I1006 08:44:27.707371 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:27 crc kubenswrapper[4610]: I1006 08:44:27.707431 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:27 crc kubenswrapper[4610]: I1006 08:44:27.707519 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:27 crc kubenswrapper[4610]: I1006 08:44:27.707449 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:32 crc kubenswrapper[4610]: I1006 08:44:32.924273 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:44:32 crc kubenswrapper[4610]: I1006 08:44:32.925217 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:44:32 crc kubenswrapper[4610]: I1006 08:44:32.966830 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:44:33 crc kubenswrapper[4610]: I1006 08:44:33.137809 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:44:34 crc kubenswrapper[4610]: I1006 08:44:34.251265 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4prcm"] Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.010851 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.061036 4610 generic.go:334] "Generic (PLEG): container finished" podID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerID="fc895675a0fb2e0c4cd235f66df0ff5409b8422fa94e5e76a993a44bc6406df8" exitCode=0 Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.061114 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7gdb" 
event={"ID":"a4bb15af-cc88-4019-a8b5-7e1670842bc3","Type":"ContainerDied","Data":"fc895675a0fb2e0c4cd235f66df0ff5409b8422fa94e5e76a993a44bc6406df8"} Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.063735 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerID="994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4" exitCode=0 Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.063795 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerDied","Data":"994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4"} Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.063934 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4prcm" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="registry-server" containerID="cri-o://19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd" gracePeriod=2 Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.537681 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.734471 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-catalog-content\") pod \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.734612 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-utilities\") pod \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.734683 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-922nv\" (UniqueName: \"kubernetes.io/projected/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-kube-api-access-922nv\") pod \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\" (UID: \"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30\") " Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.735549 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-utilities" (OuterVolumeSpecName: "utilities") pod "d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" (UID: "d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.739649 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-kube-api-access-922nv" (OuterVolumeSpecName: "kube-api-access-922nv") pod "d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" (UID: "d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30"). InnerVolumeSpecName "kube-api-access-922nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.802200 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" (UID: "d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.836624 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.836666 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-922nv\" (UniqueName: \"kubernetes.io/projected/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-kube-api-access-922nv\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:35 crc kubenswrapper[4610]: I1006 08:44:35.836680 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.071704 4610 generic.go:334] "Generic (PLEG): container finished" podID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerID="19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd" exitCode=0 Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.071742 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4prcm" event={"ID":"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30","Type":"ContainerDied","Data":"19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd"} Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.071798 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4prcm" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.071885 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4prcm" event={"ID":"d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30","Type":"ContainerDied","Data":"6ab51236149efcc07f211b563925c6ec75a4a578b0dad44b1417636288934f6e"} Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.071909 4610 scope.go:117] "RemoveContainer" containerID="19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.116915 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4prcm"] Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.120840 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4prcm"] Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.122803 4610 scope.go:117] "RemoveContainer" containerID="7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.156394 4610 scope.go:117] "RemoveContainer" containerID="8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.204630 4610 scope.go:117] "RemoveContainer" containerID="19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd" Oct 06 08:44:36 crc kubenswrapper[4610]: E1006 08:44:36.205015 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd\": container with ID starting with 19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd not found: ID does not exist" containerID="19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.205059 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd"} err="failed to get container status \"19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd\": rpc error: code = NotFound desc = could not find container \"19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd\": container with ID starting with 19a1d4d1659d7355a39ce732fc684188c4f5b55f0970e866d30bbe94b4adcdfd not found: ID does not exist" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.205084 4610 scope.go:117] "RemoveContainer" containerID="7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4" Oct 06 08:44:36 crc kubenswrapper[4610]: E1006 08:44:36.205872 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4\": container with ID starting with 7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4 not found: ID does not exist" containerID="7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.205915 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4"} err="failed to get container status \"7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4\": rpc error: code = NotFound desc = could not find 
container \"7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4\": container with ID starting with 7674149ba0dc33e8cc5b2454419fe995c495b26a8d8c5fa24a8ab5f1460c3da4 not found: ID does not exist" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.205942 4610 scope.go:117] "RemoveContainer" containerID="8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf" Oct 06 08:44:36 crc kubenswrapper[4610]: E1006 08:44:36.206193 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf\": container with ID starting with 8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf not found: ID does not exist" containerID="8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.206209 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf"} err="failed to get container status \"8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf\": rpc error: code = NotFound desc = could not find container \"8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf\": container with ID starting with 8cc418801e5abac6d97ea6a8ca3cab0c3f8110dc7f2ff7e00839ad68086e3abf not found: ID does not exist" Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.851259 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqb7g"] Oct 06 08:44:36 crc kubenswrapper[4610]: I1006 08:44:36.851682 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qqb7g" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="registry-server" containerID="cri-o://aa3b6eae34a9e7b167e3bf34b8babcf05eb06a38cf0148e5720f03b6ac3e0d18" gracePeriod=2 Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.080243 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" path="/var/lib/kubelet/pods/d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30/volumes" Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.081919 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerStarted","Data":"716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407"} Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.084473 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerStarted","Data":"4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a"} Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.087028 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7gdb" event={"ID":"a4bb15af-cc88-4019-a8b5-7e1670842bc3","Type":"ContainerStarted","Data":"80185bf3b4f0860c29052209dea3abc97c31526af19ab102692b886213675e34"} Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.088885 4610 generic.go:334] "Generic (PLEG): container finished" podID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerID="6727d94712f4ccf0542670e2320176c6780560924ce1df1a7c6181c0d75dd9da" exitCode=0 Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.088960 4610 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5nfg" event={"ID":"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e","Type":"ContainerDied","Data":"6727d94712f4ccf0542670e2320176c6780560924ce1df1a7c6181c0d75dd9da"} Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.095069 4610 generic.go:334] "Generic (PLEG): container finished" podID="b812be8d-d295-4878-9af8-c0387a655dbc" containerID="299a83729aad744562c2fa14a67504640cd34cd6343504e40ee655f46b2b4b37" exitCode=0 Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.095124 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerDied","Data":"299a83729aad744562c2fa14a67504640cd34cd6343504e40ee655f46b2b4b37"} Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.099086 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerStarted","Data":"37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66"} Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.148400 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7gdb" podStartSLOduration=6.443189035 podStartE2EDuration="1m35.148377688s" podCreationTimestamp="2025-10-06 08:43:02 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.535014785 +0000 UTC m=+119.250068173" lastFinishedPulling="2025-10-06 08:44:36.240203438 +0000 UTC m=+207.955256826" observedRunningTime="2025-10-06 08:44:37.143817167 +0000 UTC m=+208.858870575" watchObservedRunningTime="2025-10-06 08:44:37.148377688 +0000 UTC m=+208.863431076" Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.168444 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84vcq" podStartSLOduration=5.552856579 podStartE2EDuration="1m35.168429483s" podCreationTimestamp="2025-10-06 08:43:02 +0000 UTC" firstStartedPulling="2025-10-06 08:43:06.507249799 +0000 UTC m=+118.222303187" lastFinishedPulling="2025-10-06 08:44:36.122822703 +0000 UTC m=+207.837876091" observedRunningTime="2025-10-06 08:44:37.167163517 +0000 UTC m=+208.882216925" watchObservedRunningTime="2025-10-06 08:44:37.168429483 +0000 UTC m=+208.883482871" Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.707283 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.707335 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.707391 4610 patch_prober.go:28] interesting pod/downloads-7954f5f757-klkbs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:44:37 crc kubenswrapper[4610]: I1006 08:44:37.707483 4610 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-klkbs" podUID="0f3e1eb2-9480-4350-8a62-c8c25f8dcc7a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.105303 4610 generic.go:334] "Generic (PLEG): container finished" podID="8779507e-e1f3-45fe-999f-69d9ea563140" containerID="aa3b6eae34a9e7b167e3bf34b8babcf05eb06a38cf0148e5720f03b6ac3e0d18" exitCode=0 Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.106222 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqb7g" event={"ID":"8779507e-e1f3-45fe-999f-69d9ea563140","Type":"ContainerDied","Data":"aa3b6eae34a9e7b167e3bf34b8babcf05eb06a38cf0148e5720f03b6ac3e0d18"} Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.107718 4610 generic.go:334] "Generic (PLEG): container finished" podID="fadb3764-e589-451f-8337-1c4a9bb988af" containerID="716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407" exitCode=0 Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.107756 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerDied","Data":"716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407"} Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.109777 4610 generic.go:334] "Generic (PLEG): container finished" podID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerID="4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a" exitCode=0 Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.109799 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerDied","Data":"4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a"} Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.375101 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.410556 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-catalog-content\") pod \"8779507e-e1f3-45fe-999f-69d9ea563140\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.410657 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf65r\" (UniqueName: \"kubernetes.io/projected/8779507e-e1f3-45fe-999f-69d9ea563140-kube-api-access-kf65r\") pod \"8779507e-e1f3-45fe-999f-69d9ea563140\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.410715 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-utilities\") pod \"8779507e-e1f3-45fe-999f-69d9ea563140\" (UID: \"8779507e-e1f3-45fe-999f-69d9ea563140\") " Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.411961 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-utilities" (OuterVolumeSpecName: "utilities") pod "8779507e-e1f3-45fe-999f-69d9ea563140" (UID: "8779507e-e1f3-45fe-999f-69d9ea563140"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.423312 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8779507e-e1f3-45fe-999f-69d9ea563140-kube-api-access-kf65r" (OuterVolumeSpecName: "kube-api-access-kf65r") pod "8779507e-e1f3-45fe-999f-69d9ea563140" (UID: "8779507e-e1f3-45fe-999f-69d9ea563140"). InnerVolumeSpecName "kube-api-access-kf65r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.426185 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8779507e-e1f3-45fe-999f-69d9ea563140" (UID: "8779507e-e1f3-45fe-999f-69d9ea563140"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.511433 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.511465 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8779507e-e1f3-45fe-999f-69d9ea563140-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:38 crc kubenswrapper[4610]: I1006 08:44:38.511478 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf65r\" (UniqueName: \"kubernetes.io/projected/8779507e-e1f3-45fe-999f-69d9ea563140-kube-api-access-kf65r\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:39 crc kubenswrapper[4610]: I1006 08:44:39.115681 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqb7g" event={"ID":"8779507e-e1f3-45fe-999f-69d9ea563140","Type":"ContainerDied","Data":"c281f036325255843fc6e813b3a246085e664078408f60eba819dcf369b27876"} Oct 06 08:44:39 crc kubenswrapper[4610]: I1006 08:44:39.115767 4610 scope.go:117] "RemoveContainer" containerID="aa3b6eae34a9e7b167e3bf34b8babcf05eb06a38cf0148e5720f03b6ac3e0d18" Oct 06 08:44:39 crc kubenswrapper[4610]: I1006 08:44:39.115791 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqb7g" Oct 06 08:44:39 crc kubenswrapper[4610]: I1006 08:44:39.138932 4610 scope.go:117] "RemoveContainer" containerID="a2b32ccd29f894fd54c75e307d33e6226fabd717811c4bc72e4012d57abf9541" Oct 06 08:44:39 crc kubenswrapper[4610]: I1006 08:44:39.144777 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqb7g"] Oct 06 08:44:39 crc kubenswrapper[4610]: I1006 08:44:39.147307 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqb7g"] Oct 06 08:44:40 crc kubenswrapper[4610]: I1006 08:44:40.274944 4610 scope.go:117] "RemoveContainer" containerID="9c1aa35c15e64c96cd2664ed266a6bf27e56c0b77de321e745b3f5b42e3d9a38" Oct 06 08:44:41 crc kubenswrapper[4610]: I1006 08:44:41.090698 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" path="/var/lib/kubelet/pods/8779507e-e1f3-45fe-999f-69d9ea563140/volumes" Oct 06 08:44:42 crc kubenswrapper[4610]: I1006 08:44:42.556170 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:44:42 crc kubenswrapper[4610]: I1006 08:44:42.556210 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:44:42 crc kubenswrapper[4610]: I1006 08:44:42.608014 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:44:43 crc kubenswrapper[4610]: I1006 08:44:43.180609 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:44:43 crc kubenswrapper[4610]: I1006 08:44:43.458777 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:44:43 crc kubenswrapper[4610]: I1006 08:44:43.458837 4610 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:44:43 crc kubenswrapper[4610]: I1006 08:44:43.501413 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:44:44 crc kubenswrapper[4610]: I1006 08:44:44.222139 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.252223 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7gdb"] Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.253136 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7gdb" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="registry-server" containerID="cri-o://80185bf3b4f0860c29052209dea3abc97c31526af19ab102692b886213675e34" gracePeriod=2 Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.469677 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.469740 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.469784 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.470407 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:44:46 crc kubenswrapper[4610]: I1006 08:44:46.470481 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99" gracePeriod=600 Oct 06 08:44:47 crc kubenswrapper[4610]: I1006 08:44:47.168103 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99" exitCode=0 Oct 06 08:44:47 crc kubenswrapper[4610]: I1006 08:44:47.168176 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99"} Oct 06 08:44:47 crc kubenswrapper[4610]: I1006 08:44:47.170945 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-r5nfg" event={"ID":"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e","Type":"ContainerStarted","Data":"68b8c6f605c526a41138896a1507177fe0b87ca94f1a5a5c3d4a931590e0ca16"} Oct 06 08:44:47 crc kubenswrapper[4610]: I1006 08:44:47.173092 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerStarted","Data":"b485b2ae9f6c1d7c8f36165f009edbd7a6b66479ad1395c82af6cb65c10148f4"} Oct 06 08:44:47 crc kubenswrapper[4610]: I1006 08:44:47.712803 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-klkbs" Oct 06 08:44:48 crc kubenswrapper[4610]: I1006 08:44:48.180030 4610 generic.go:334] "Generic (PLEG): container finished" podID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerID="80185bf3b4f0860c29052209dea3abc97c31526af19ab102692b886213675e34" exitCode=0 Oct 06 08:44:48 crc kubenswrapper[4610]: I1006 08:44:48.180086 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7gdb" event={"ID":"a4bb15af-cc88-4019-a8b5-7e1670842bc3","Type":"ContainerDied","Data":"80185bf3b4f0860c29052209dea3abc97c31526af19ab102692b886213675e34"} Oct 06 08:44:48 crc kubenswrapper[4610]: I1006 08:44:48.199598 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r5nfg" podStartSLOduration=5.715320863 podStartE2EDuration="1m44.199579484s" podCreationTimestamp="2025-10-06 08:43:04 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.531859146 +0000 UTC m=+119.246912534" lastFinishedPulling="2025-10-06 08:44:46.016117727 +0000 UTC m=+217.731171155" observedRunningTime="2025-10-06 08:44:48.196364982 +0000 UTC m=+219.911418370" watchObservedRunningTime="2025-10-06 08:44:48.199579484 +0000 UTC m=+219.914632872" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.127643 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.147162 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h8dz4" podStartSLOduration=11.295432536 podStartE2EDuration="1m47.147143104s" podCreationTimestamp="2025-10-06 08:43:02 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.534098069 +0000 UTC m=+119.249151457" lastFinishedPulling="2025-10-06 08:44:43.385808637 +0000 UTC m=+215.100862025" observedRunningTime="2025-10-06 08:44:48.219697741 +0000 UTC m=+219.934751129" watchObservedRunningTime="2025-10-06 08:44:49.147143104 +0000 UTC m=+220.862196512" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.186681 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7gdb" event={"ID":"a4bb15af-cc88-4019-a8b5-7e1670842bc3","Type":"ContainerDied","Data":"a183f9dd7d7b6a0e75555e099379d2be077eefb370f1a5299cee140805cea6a5"} Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.186732 4610 scope.go:117] "RemoveContainer" containerID="80185bf3b4f0860c29052209dea3abc97c31526af19ab102692b886213675e34" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.186731 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7gdb" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.273842 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9rhd\" (UniqueName: \"kubernetes.io/projected/a4bb15af-cc88-4019-a8b5-7e1670842bc3-kube-api-access-r9rhd\") pod \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.273933 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-utilities\") pod \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.273975 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-catalog-content\") pod \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\" (UID: \"a4bb15af-cc88-4019-a8b5-7e1670842bc3\") " Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.275894 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-utilities" (OuterVolumeSpecName: "utilities") pod "a4bb15af-cc88-4019-a8b5-7e1670842bc3" (UID: "a4bb15af-cc88-4019-a8b5-7e1670842bc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.281281 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4bb15af-cc88-4019-a8b5-7e1670842bc3-kube-api-access-r9rhd" (OuterVolumeSpecName: "kube-api-access-r9rhd") pod "a4bb15af-cc88-4019-a8b5-7e1670842bc3" (UID: "a4bb15af-cc88-4019-a8b5-7e1670842bc3"). InnerVolumeSpecName "kube-api-access-r9rhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.375783 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9rhd\" (UniqueName: \"kubernetes.io/projected/a4bb15af-cc88-4019-a8b5-7e1670842bc3-kube-api-access-r9rhd\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:49 crc kubenswrapper[4610]: I1006 08:44:49.375817 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:50 crc kubenswrapper[4610]: I1006 08:44:50.172870 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4bb15af-cc88-4019-a8b5-7e1670842bc3" (UID: "a4bb15af-cc88-4019-a8b5-7e1670842bc3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:44:50 crc kubenswrapper[4610]: I1006 08:44:50.187577 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bb15af-cc88-4019-a8b5-7e1670842bc3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:44:50 crc kubenswrapper[4610]: I1006 08:44:50.385106 4610 scope.go:117] "RemoveContainer" containerID="fc895675a0fb2e0c4cd235f66df0ff5409b8422fa94e5e76a993a44bc6406df8" Oct 06 08:44:50 crc kubenswrapper[4610]: I1006 08:44:50.427816 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7gdb"] Oct 06 08:44:50 crc kubenswrapper[4610]: I1006 08:44:50.430237 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7gdb"] Oct 06 08:44:51 crc kubenswrapper[4610]: I1006 08:44:51.077978 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" path="/var/lib/kubelet/pods/a4bb15af-cc88-4019-a8b5-7e1670842bc3/volumes" Oct 06 08:44:51 crc kubenswrapper[4610]: I1006 08:44:51.981295 4610 scope.go:117] "RemoveContainer" containerID="56167533afad85023ce9c48edf5810434312e17fd99f2ac8918dc6b2e7a2659c" Oct 06 08:44:52 crc kubenswrapper[4610]: I1006 08:44:52.729179 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:44:52 crc kubenswrapper[4610]: I1006 08:44:52.729469 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:44:52 crc kubenswrapper[4610]: I1006 08:44:52.770268 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:44:53 crc kubenswrapper[4610]: I1006 08:44:53.211787 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"3752446b66bca6bd4c906f7907b7377a8f00a694c4d89384cbc86783dc5e79dd"} Oct 06 08:44:53 crc kubenswrapper[4610]: I1006 08:44:53.213520 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerStarted","Data":"c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df"} Oct 06 08:44:53 crc kubenswrapper[4610]: I1006 08:44:53.215377 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerStarted","Data":"3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867"} Oct 06 08:44:53 crc kubenswrapper[4610]: I1006 08:44:53.262610 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:44:53 crc kubenswrapper[4610]: I1006 08:44:53.278262 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5d4ht" podStartSLOduration=3.78888894 podStartE2EDuration="1m48.278242977s" podCreationTimestamp="2025-10-06 08:43:05 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.53094578 +0000 UTC m=+119.245999168" lastFinishedPulling="2025-10-06 08:44:52.020299817 +0000 UTC m=+223.735353205" observedRunningTime="2025-10-06 08:44:53.259016455 +0000 UTC 
m=+224.974069843" watchObservedRunningTime="2025-10-06 08:44:53.278242977 +0000 UTC m=+224.993296365" Oct 06 08:44:53 crc kubenswrapper[4610]: I1006 08:44:53.278656 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2sz4l" podStartSLOduration=3.763644605 podStartE2EDuration="1m48.278652148s" podCreationTimestamp="2025-10-06 08:43:05 +0000 UTC" firstStartedPulling="2025-10-06 08:43:07.530225219 +0000 UTC m=+119.245278597" lastFinishedPulling="2025-10-06 08:44:52.045232752 +0000 UTC m=+223.760286140" observedRunningTime="2025-10-06 08:44:53.276000482 +0000 UTC m=+224.991053870" watchObservedRunningTime="2025-10-06 08:44:53.278652148 +0000 UTC m=+224.993705536" Oct 06 08:44:54 crc kubenswrapper[4610]: I1006 08:44:54.577536 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:44:54 crc kubenswrapper[4610]: I1006 08:44:54.579247 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:44:54 crc kubenswrapper[4610]: I1006 08:44:54.629633 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:44:55 crc kubenswrapper[4610]: I1006 08:44:55.272843 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:44:55 crc kubenswrapper[4610]: I1006 08:44:55.726835 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:44:55 crc kubenswrapper[4610]: I1006 08:44:55.726903 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:44:56 crc kubenswrapper[4610]: I1006 08:44:56.118895 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:44:56 crc kubenswrapper[4610]: I1006 08:44:56.120452 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:44:56 crc kubenswrapper[4610]: I1006 08:44:56.768411 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2sz4l" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="registry-server" probeResult="failure" output=< Oct 06 08:44:56 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 08:44:56 crc kubenswrapper[4610]: > Oct 06 08:44:57 crc kubenswrapper[4610]: I1006 08:44:57.164731 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5d4ht" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="registry-server" probeResult="failure" output=< Oct 06 08:44:57 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 08:44:57 crc kubenswrapper[4610]: > Oct 06 08:44:57 crc kubenswrapper[4610]: I1006 08:44:57.205829 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.142677 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4"] Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143073 4610 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="extract-utilities" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143085 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="extract-utilities" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143095 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143101 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143110 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143116 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143125 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="extract-content" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143131 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="extract-content" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143140 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="extract-content" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143146 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="extract-content" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143153 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143158 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143168 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="extract-utilities" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143173 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="extract-utilities" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143183 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="extract-utilities" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143188 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="extract-utilities" Oct 06 08:45:00 crc kubenswrapper[4610]: E1006 08:45:00.143198 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="extract-content" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143203 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="extract-content" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 
08:45:00.143286 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8779507e-e1f3-45fe-999f-69d9ea563140" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143297 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60cbdd2-6ec0-45e3-bcd9-a6a100c75f30" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143309 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4bb15af-cc88-4019-a8b5-7e1670842bc3" containerName="registry-server" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.143646 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.145664 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.145851 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.150127 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/000cae55-4bf2-468e-8cb6-f056257b1f2e-secret-volume\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.150214 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96tpd\" (UniqueName: \"kubernetes.io/projected/000cae55-4bf2-468e-8cb6-f056257b1f2e-kube-api-access-96tpd\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.150315 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/000cae55-4bf2-468e-8cb6-f056257b1f2e-config-volume\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.160653 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4"] Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.251133 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/000cae55-4bf2-468e-8cb6-f056257b1f2e-config-volume\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.251219 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/000cae55-4bf2-468e-8cb6-f056257b1f2e-secret-volume\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 
crc kubenswrapper[4610]: I1006 08:45:00.251277 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96tpd\" (UniqueName: \"kubernetes.io/projected/000cae55-4bf2-468e-8cb6-f056257b1f2e-kube-api-access-96tpd\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.252360 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/000cae55-4bf2-468e-8cb6-f056257b1f2e-config-volume\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.260035 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/000cae55-4bf2-468e-8cb6-f056257b1f2e-secret-volume\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.283621 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96tpd\" (UniqueName: \"kubernetes.io/projected/000cae55-4bf2-468e-8cb6-f056257b1f2e-kube-api-access-96tpd\") pod \"collect-profiles-29329005-ccvc4\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.297608 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mgm5v"] Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.461733 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:00 crc kubenswrapper[4610]: I1006 08:45:00.752029 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4"] Oct 06 08:45:01 crc kubenswrapper[4610]: I1006 08:45:01.259272 4610 generic.go:334] "Generic (PLEG): container finished" podID="000cae55-4bf2-468e-8cb6-f056257b1f2e" containerID="2a090a972808e399c9920e3f5be13003a440f02998819283ebe97ac6b56a90e7" exitCode=0 Oct 06 08:45:01 crc kubenswrapper[4610]: I1006 08:45:01.259521 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" event={"ID":"000cae55-4bf2-468e-8cb6-f056257b1f2e","Type":"ContainerDied","Data":"2a090a972808e399c9920e3f5be13003a440f02998819283ebe97ac6b56a90e7"} Oct 06 08:45:01 crc kubenswrapper[4610]: I1006 08:45:01.259553 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" event={"ID":"000cae55-4bf2-468e-8cb6-f056257b1f2e","Type":"ContainerStarted","Data":"e2ceabcf1d4546cece8539fc27493debd3d779ef43b9b37bb7ac6fcfa287a428"} Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.500169 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.680945 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96tpd\" (UniqueName: \"kubernetes.io/projected/000cae55-4bf2-468e-8cb6-f056257b1f2e-kube-api-access-96tpd\") pod \"000cae55-4bf2-468e-8cb6-f056257b1f2e\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.681314 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/000cae55-4bf2-468e-8cb6-f056257b1f2e-secret-volume\") pod \"000cae55-4bf2-468e-8cb6-f056257b1f2e\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.681350 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/000cae55-4bf2-468e-8cb6-f056257b1f2e-config-volume\") pod \"000cae55-4bf2-468e-8cb6-f056257b1f2e\" (UID: \"000cae55-4bf2-468e-8cb6-f056257b1f2e\") " Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.681999 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000cae55-4bf2-468e-8cb6-f056257b1f2e-config-volume" (OuterVolumeSpecName: "config-volume") pod "000cae55-4bf2-468e-8cb6-f056257b1f2e" (UID: "000cae55-4bf2-468e-8cb6-f056257b1f2e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.689281 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000cae55-4bf2-468e-8cb6-f056257b1f2e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "000cae55-4bf2-468e-8cb6-f056257b1f2e" (UID: "000cae55-4bf2-468e-8cb6-f056257b1f2e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.689332 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000cae55-4bf2-468e-8cb6-f056257b1f2e-kube-api-access-96tpd" (OuterVolumeSpecName: "kube-api-access-96tpd") pod "000cae55-4bf2-468e-8cb6-f056257b1f2e" (UID: "000cae55-4bf2-468e-8cb6-f056257b1f2e"). InnerVolumeSpecName "kube-api-access-96tpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.781971 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/000cae55-4bf2-468e-8cb6-f056257b1f2e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.782004 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/000cae55-4bf2-468e-8cb6-f056257b1f2e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:02 crc kubenswrapper[4610]: I1006 08:45:02.782020 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96tpd\" (UniqueName: \"kubernetes.io/projected/000cae55-4bf2-468e-8cb6-f056257b1f2e-kube-api-access-96tpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:03 crc kubenswrapper[4610]: I1006 08:45:03.272161 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" event={"ID":"000cae55-4bf2-468e-8cb6-f056257b1f2e","Type":"ContainerDied","Data":"e2ceabcf1d4546cece8539fc27493debd3d779ef43b9b37bb7ac6fcfa287a428"} Oct 06 08:45:03 crc kubenswrapper[4610]: I1006 08:45:03.272209 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ceabcf1d4546cece8539fc27493debd3d779ef43b9b37bb7ac6fcfa287a428" Oct 06 08:45:03 crc kubenswrapper[4610]: I1006 08:45:03.272254 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4" Oct 06 08:45:05 crc kubenswrapper[4610]: I1006 08:45:05.763162 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:45:05 crc kubenswrapper[4610]: I1006 08:45:05.800586 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:45:06 crc kubenswrapper[4610]: I1006 08:45:06.158487 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:45:06 crc kubenswrapper[4610]: I1006 08:45:06.196739 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:45:06 crc kubenswrapper[4610]: I1006 08:45:06.999489 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5d4ht"] Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.298359 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5d4ht" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="registry-server" containerID="cri-o://3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867" gracePeriod=2 Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.680436 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.845156 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-utilities\") pod \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.845218 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk9fm\" (UniqueName: \"kubernetes.io/projected/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-kube-api-access-wk9fm\") pod \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.845261 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-catalog-content\") pod \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\" (UID: \"cb17a66b-9ac1-4751-941a-5d8a851c2f3a\") " Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.845998 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-utilities" (OuterVolumeSpecName: "utilities") pod "cb17a66b-9ac1-4751-941a-5d8a851c2f3a" (UID: "cb17a66b-9ac1-4751-941a-5d8a851c2f3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.846180 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.850887 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-kube-api-access-wk9fm" (OuterVolumeSpecName: "kube-api-access-wk9fm") pod "cb17a66b-9ac1-4751-941a-5d8a851c2f3a" (UID: "cb17a66b-9ac1-4751-941a-5d8a851c2f3a"). InnerVolumeSpecName "kube-api-access-wk9fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.923334 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb17a66b-9ac1-4751-941a-5d8a851c2f3a" (UID: "cb17a66b-9ac1-4751-941a-5d8a851c2f3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.947886 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk9fm\" (UniqueName: \"kubernetes.io/projected/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-kube-api-access-wk9fm\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:07 crc kubenswrapper[4610]: I1006 08:45:07.948494 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb17a66b-9ac1-4751-941a-5d8a851c2f3a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.305309 4610 generic.go:334] "Generic (PLEG): container finished" podID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerID="3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867" exitCode=0 Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.305367 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4ht" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.305368 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerDied","Data":"3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867"} Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.305951 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4ht" event={"ID":"cb17a66b-9ac1-4751-941a-5d8a851c2f3a","Type":"ContainerDied","Data":"d8efe993d4aacd115ca1782e4b952f8fbdd90dffa7a0d479a76309cffc0e845d"} Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.305976 4610 scope.go:117] "RemoveContainer" containerID="3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.322461 4610 scope.go:117] "RemoveContainer" containerID="4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.335501 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5d4ht"] Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.345960 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5d4ht"] Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.357794 4610 scope.go:117] "RemoveContainer" containerID="810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.383593 4610 scope.go:117] "RemoveContainer" containerID="3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867" Oct 06 08:45:08 crc kubenswrapper[4610]: E1006 08:45:08.384613 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867\": container with ID starting with 3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867 not found: ID does not exist" containerID="3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.384794 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867"} err="failed to get container status \"3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867\": 
rpc error: code = NotFound desc = could not find container \"3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867\": container with ID starting with 3bbd2099e4c03b2373af2ebf5f90ba94179c132e8758e164a9d3de416b2aa867 not found: ID does not exist" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.384930 4610 scope.go:117] "RemoveContainer" containerID="4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a" Oct 06 08:45:08 crc kubenswrapper[4610]: E1006 08:45:08.385418 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a\": container with ID starting with 4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a not found: ID does not exist" containerID="4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.385457 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a"} err="failed to get container status \"4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a\": rpc error: code = NotFound desc = could not find container \"4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a\": container with ID starting with 4c64d9d90bf4a7d98aa6d50351497d820d356bbd5002421278662710dfe0e39a not found: ID does not exist" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.385483 4610 scope.go:117] "RemoveContainer" containerID="810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e" Oct 06 08:45:08 crc kubenswrapper[4610]: E1006 08:45:08.385742 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e\": container with ID starting with 810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e not found: ID does not exist" containerID="810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e" Oct 06 08:45:08 crc kubenswrapper[4610]: I1006 08:45:08.385780 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e"} err="failed to get container status \"810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e\": rpc error: code = NotFound desc = could not find container \"810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e\": container with ID starting with 810c6c5ea4ef64813c18451699f1b58109317e0a0bc866f083e15d20c3990b8e not found: ID does not exist" Oct 06 08:45:09 crc kubenswrapper[4610]: I1006 08:45:09.086965 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" path="/var/lib/kubelet/pods/cb17a66b-9ac1-4751-941a-5d8a851c2f3a/volumes" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.332623 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" podUID="8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" containerName="oauth-openshift" containerID="cri-o://ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a" gracePeriod=15 Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.787139 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.825617 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-575cc5b957-rp88j"] Oct 06 08:45:25 crc kubenswrapper[4610]: E1006 08:45:25.826222 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="extract-content" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826238 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="extract-content" Oct 06 08:45:25 crc kubenswrapper[4610]: E1006 08:45:25.826256 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000cae55-4bf2-468e-8cb6-f056257b1f2e" containerName="collect-profiles" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826263 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="000cae55-4bf2-468e-8cb6-f056257b1f2e" containerName="collect-profiles" Oct 06 08:45:25 crc kubenswrapper[4610]: E1006 08:45:25.826276 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="extract-utilities" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826283 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="extract-utilities" Oct 06 08:45:25 crc kubenswrapper[4610]: E1006 08:45:25.826290 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" containerName="oauth-openshift" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826296 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" containerName="oauth-openshift" Oct 06 08:45:25 crc kubenswrapper[4610]: E1006 08:45:25.826303 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="registry-server" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826309 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="registry-server" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826416 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" containerName="oauth-openshift" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826426 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="000cae55-4bf2-468e-8cb6-f056257b1f2e" containerName="collect-profiles" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826436 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb17a66b-9ac1-4751-941a-5d8a851c2f3a" containerName="registry-server" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.826846 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.840935 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-575cc5b957-rp88j"] Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.921546 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-router-certs\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.921631 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-provider-selection\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.921691 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-login\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.921738 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-trusted-ca-bundle\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.921776 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-service-ca\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.922725 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.922800 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-policies\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.923487 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-idp-0-file-data\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.923374 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.923425 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924127 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmwf6\" (UniqueName: \"kubernetes.io/projected/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-kube-api-access-hmwf6\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924537 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-serving-cert\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924589 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-session\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924615 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-ocp-branding-template\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924654 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-cliconfig\") pod 
\"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924672 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-dir\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924695 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-error\") pod \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\" (UID: \"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0\") " Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924774 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-cliconfig\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924812 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/defa9a43-b0f4-46a6-b7e0-280d210d2e71-audit-dir\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924835 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrdd\" (UniqueName: \"kubernetes.io/projected/defa9a43-b0f4-46a6-b7e0-280d210d2e71-kube-api-access-hsrdd\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924852 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-audit-policies\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924889 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-error\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924910 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-login\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 
08:45:25.924938 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.924989 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-serving-cert\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925009 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-service-ca\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925024 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-router-certs\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925088 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925115 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925153 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925178 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-session\") pod 
\"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925230 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925244 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.925253 4610 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.926233 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.926770 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.928018 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.928208 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.929031 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-kube-api-access-hmwf6" (OuterVolumeSpecName: "kube-api-access-hmwf6") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "kube-api-access-hmwf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.930505 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.930723 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.931496 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.931663 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.934007 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:25 crc kubenswrapper[4610]: I1006 08:45:25.938579 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" (UID: "8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.025872 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-cliconfig\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.025931 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/defa9a43-b0f4-46a6-b7e0-280d210d2e71-audit-dir\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.025953 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrdd\" (UniqueName: \"kubernetes.io/projected/defa9a43-b0f4-46a6-b7e0-280d210d2e71-kube-api-access-hsrdd\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.025971 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-audit-policies\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.025991 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-error\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026010 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-login\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026039 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026078 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-serving-cert\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc 
kubenswrapper[4610]: I1006 08:45:26.026093 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-service-ca\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026108 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-router-certs\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026135 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026160 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026177 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026198 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-session\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026232 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026242 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmwf6\" (UniqueName: \"kubernetes.io/projected/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-kube-api-access-hmwf6\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026252 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" 
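The interleaved TearDown / "MountVolume started" / "Volume detached" entries above and below are the kubelet volume manager reconciling actual state toward desired state across the oauth-openshift rollover: every volume of the exiting replica (pod UID 8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0) is unmounted and reported detached, while the same set of volumes is mounted for the replacement pod (UID defa9a43-b0f4-46a6-b7e0-280d210d2e71). A toy Go sketch of that diff-and-act loop follows; the types and messages are hypothetical stand-ins, not kubelet's actual reconciler API.

    package main

    import "fmt"

    // mountKey identifies one volume mount on the node (hypothetical type).
    type mountKey struct{ podUID, volume string }

    // reconcile diffs desired mounts against actual mounts and acts on the
    // difference, the way reconciler_common.go pairs the UnmountVolume and
    // MountVolume entries in the log above.
    func reconcile(desired, actual map[mountKey]bool) {
        for k := range desired {
            if !actual[k] {
                fmt.Printf("MountVolume started for volume %q pod %q\n", k.volume, k.podUID)
            }
        }
        for k := range actual {
            if !desired[k] {
                fmt.Printf("UnmountVolume started for volume %q pod %q\n", k.volume, k.podUID)
            }
        }
    }

    func main() {
        oldPod, newPod := "8fc0c6bd", "defa9a43" // truncated UIDs from the log
        actual := map[mountKey]bool{
            {oldPod, "v4-0-config-system-session"}: true,
            {oldPod, "audit-dir"}:                  true,
        }
        desired := map[mountKey]bool{
            {newPod, "v4-0-config-system-session"}: true,
            {newPod, "audit-dir"}:                  true,
        }
        reconcile(desired, actual) // two mounts for the new pod, two unmounts for the old
    }
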
Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026261 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026271 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026280 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026288 4610 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026297 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026310 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026321 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026330 4610 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.026483 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/defa9a43-b0f4-46a6-b7e0-280d210d2e71-audit-dir\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.027036 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-service-ca\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.027254 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-audit-policies\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " 
pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.027802 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.029387 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-session\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.029387 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-serving-cert\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.029931 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-error\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.030384 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-router-certs\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.030421 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.030602 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.030708 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-login\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " 
pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.032177 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-system-cliconfig\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.032764 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/defa9a43-b0f4-46a6-b7e0-280d210d2e71-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.044633 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrdd\" (UniqueName: \"kubernetes.io/projected/defa9a43-b0f4-46a6-b7e0-280d210d2e71-kube-api-access-hsrdd\") pod \"oauth-openshift-575cc5b957-rp88j\" (UID: \"defa9a43-b0f4-46a6-b7e0-280d210d2e71\") " pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.149887 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.344145 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-575cc5b957-rp88j"] Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.411190 4610 generic.go:334] "Generic (PLEG): container finished" podID="8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0" containerID="ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a" exitCode=0 Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.411258 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" event={"ID":"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0","Type":"ContainerDied","Data":"ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a"} Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.411773 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v" event={"ID":"8fc0c6bd-fbc0-4666-83ac-7dff8e000bf0","Type":"ContainerDied","Data":"a3d927d7b8273b4024b61f708eb5a8fcd9830c9bcb39603c532bc5efe716dfd4"} Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.411276 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mgm5v"
Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.411801 4610 scope.go:117] "RemoveContainer" containerID="ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a"
Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.414903 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-575cc5b957-rp88j" event={"ID":"defa9a43-b0f4-46a6-b7e0-280d210d2e71","Type":"ContainerStarted","Data":"8fd7440d72471889f7d87ee4550756b38d9c4bb424f3d2636ffa4ea04e81ad42"}
Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.439124 4610 scope.go:117] "RemoveContainer" containerID="ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a"
Oct 06 08:45:26 crc kubenswrapper[4610]: E1006 08:45:26.439725 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a\": container with ID starting with ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a not found: ID does not exist" containerID="ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a"
Oct 06 08:45:26 crc kubenswrapper[4610]: I1006 08:45:26.439780 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a"} err="failed to get container status \"ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a\": rpc error: code = NotFound desc = could not find container \"ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a\": container with ID starting with ce1574893ab1da27338f0f8365f104d02ec888c0e9c3a8f604bcb1bf560c7b8a not found: ID does not exist"
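The ContainerStatus / DeleteContainer NotFound errors just above are a benign race, not a failure: the old oauth container had already been removed by the time the kubelet's retry fired, so CRI-O answered "ID does not exist" and the kubelet moved on. Cleanup paths like this are written to be idempotent; below is a minimal sketch of the pattern under assumed types (the real kubelet inspects the gRPC NotFound status code returned over CRI, not a sentinel error).

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for CRI's codes.NotFound; everything here is a
    // hypothetical stand-in for the kubelet/CRI-O interaction in the log.
    var errNotFound = errors.New("no such container")

    type fakeRuntime struct{ containers map[string]bool }

    func (r *fakeRuntime) remove(id string) error {
        if !r.containers[id] {
            return errNotFound
        }
        delete(r.containers, id)
        return nil
    }

    // removeIdempotent treats NotFound as success: if the container is already
    // gone, the goal state ("container removed") is reached either way.
    func removeIdempotent(r *fakeRuntime, id string) error {
        if err := r.remove(id); err != nil && !errors.Is(err, errNotFound) {
            return err
        }
        return nil
    }

    func main() {
        r := &fakeRuntime{containers: map[string]bool{"ce157489": true}}
        fmt.Println(removeIdempotent(r, "ce157489")) // <nil>: removed
        fmt.Println(removeIdempotent(r, "ce157489")) // <nil>: already gone, still fine
    }
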
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:45:27.451159989 +0000 UTC m=+259.166213477" watchObservedRunningTime="2025-10-06 08:45:27.452777655 +0000 UTC m=+259.167831053" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.363687 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8dz4"] Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.364492 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h8dz4" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="registry-server" containerID="cri-o://b485b2ae9f6c1d7c8f36165f009edbd7a6b66479ad1395c82af6cb65c10148f4" gracePeriod=30 Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.383360 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84vcq"] Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.397632 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t5gpw"] Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.397881 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" containerID="cri-o://6cccc804c136975cd759ff8806d395cd73e72a1bda7c899ae6210fad4355bfed" gracePeriod=30 Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.407625 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5nfg"] Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.408252 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r5nfg" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="registry-server" containerID="cri-o://68b8c6f605c526a41138896a1507177fe0b87ca94f1a5a5c3d4a931590e0ca16" gracePeriod=30 Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.422147 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sz4l"] Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.422412 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2sz4l" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="registry-server" containerID="cri-o://c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df" gracePeriod=30 Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.445175 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q9tl7"] Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.447550 4610 util.go:30] "No sandbox for pod can be found. 
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.447550 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.459974 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q9tl7"]
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.566142 4610 generic.go:334] "Generic (PLEG): container finished" podID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerID="68b8c6f605c526a41138896a1507177fe0b87ca94f1a5a5c3d4a931590e0ca16" exitCode=0
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.566520 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5nfg" event={"ID":"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e","Type":"ContainerDied","Data":"68b8c6f605c526a41138896a1507177fe0b87ca94f1a5a5c3d4a931590e0ca16"}
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.576686 4610 generic.go:334] "Generic (PLEG): container finished" podID="b812be8d-d295-4878-9af8-c0387a655dbc" containerID="b485b2ae9f6c1d7c8f36165f009edbd7a6b66479ad1395c82af6cb65c10148f4" exitCode=0
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.576751 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerDied","Data":"b485b2ae9f6c1d7c8f36165f009edbd7a6b66479ad1395c82af6cb65c10148f4"}
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.579417 4610 generic.go:334] "Generic (PLEG): container finished" podID="13830283-fabe-488e-98a3-767df413452b" containerID="6cccc804c136975cd759ff8806d395cd73e72a1bda7c899ae6210fad4355bfed" exitCode=0
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.579643 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84vcq" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="registry-server" containerID="cri-o://37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66" gracePeriod=30
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.579919 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" event={"ID":"13830283-fabe-488e-98a3-767df413452b","Type":"ContainerDied","Data":"6cccc804c136975cd759ff8806d395cd73e72a1bda7c899ae6210fad4355bfed"}
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.590331 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f273f28-c469-4f2f-a0de-bad2dd1345cb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.590392 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f273f28-c469-4f2f-a0de-bad2dd1345cb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.590458 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mgjh\" (UniqueName: \"kubernetes.io/projected/2f273f28-c469-4f2f-a0de-bad2dd1345cb-kube-api-access-5mgjh\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7"
\"kubernetes.io/projected/2f273f28-c469-4f2f-a0de-bad2dd1345cb-kube-api-access-5mgjh\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.693335 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f273f28-c469-4f2f-a0de-bad2dd1345cb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.693407 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f273f28-c469-4f2f-a0de-bad2dd1345cb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.693456 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mgjh\" (UniqueName: \"kubernetes.io/projected/2f273f28-c469-4f2f-a0de-bad2dd1345cb-kube-api-access-5mgjh\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.695154 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f273f28-c469-4f2f-a0de-bad2dd1345cb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.699394 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2f273f28-c469-4f2f-a0de-bad2dd1345cb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.715529 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mgjh\" (UniqueName: \"kubernetes.io/projected/2f273f28-c469-4f2f-a0de-bad2dd1345cb-kube-api-access-5mgjh\") pod \"marketplace-operator-79b997595-q9tl7\" (UID: \"2f273f28-c469-4f2f-a0de-bad2dd1345cb\") " pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:45 crc kubenswrapper[4610]: E1006 08:45:45.731590 4610 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df is running failed: container process not found" containerID="c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df" cmd=["grpc_health_probe","-addr=:50051"] Oct 06 08:45:45 crc kubenswrapper[4610]: E1006 08:45:45.735595 4610 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.736773 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.742540 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8dz4"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.853475 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.868590 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5nfg"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.869115 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sz4l"
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.902911 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-catalog-content\") pod \"b812be8d-d295-4878-9af8-c0387a655dbc\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") "
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.903011 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b76tl\" (UniqueName: \"kubernetes.io/projected/b812be8d-d295-4878-9af8-c0387a655dbc-kube-api-access-b76tl\") pod \"b812be8d-d295-4878-9af8-c0387a655dbc\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") "
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.903144 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-utilities\") pod \"b812be8d-d295-4878-9af8-c0387a655dbc\" (UID: \"b812be8d-d295-4878-9af8-c0387a655dbc\") "
Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.904011 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-utilities" (OuterVolumeSpecName: "utilities") pod "b812be8d-d295-4878-9af8-c0387a655dbc" (UID: "b812be8d-d295-4878-9af8-c0387a655dbc").
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.931267 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b812be8d-d295-4878-9af8-c0387a655dbc-kube-api-access-b76tl" (OuterVolumeSpecName: "kube-api-access-b76tl") pod "b812be8d-d295-4878-9af8-c0387a655dbc" (UID: "b812be8d-d295-4878-9af8-c0387a655dbc"). InnerVolumeSpecName "kube-api-access-b76tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:45 crc kubenswrapper[4610]: I1006 08:45:45.992521 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b812be8d-d295-4878-9af8-c0387a655dbc" (UID: "b812be8d-d295-4878-9af8-c0387a655dbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.010950 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13830283-fabe-488e-98a3-767df413452b-marketplace-operator-metrics\") pod \"13830283-fabe-488e-98a3-767df413452b\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011670 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-utilities\") pod \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011701 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-catalog-content\") pod \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011737 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13830283-fabe-488e-98a3-767df413452b-marketplace-trusted-ca\") pod \"13830283-fabe-488e-98a3-767df413452b\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011770 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-utilities\") pod \"fadb3764-e589-451f-8337-1c4a9bb988af\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011801 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pq28\" (UniqueName: \"kubernetes.io/projected/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-kube-api-access-5pq28\") pod \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\" (UID: \"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011832 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-catalog-content\") pod \"fadb3764-e589-451f-8337-1c4a9bb988af\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " Oct 06 08:45:46 crc kubenswrapper[4610]: 
I1006 08:45:46.011943 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5m5g\" (UniqueName: \"kubernetes.io/projected/fadb3764-e589-451f-8337-1c4a9bb988af-kube-api-access-r5m5g\") pod \"fadb3764-e589-451f-8337-1c4a9bb988af\" (UID: \"fadb3764-e589-451f-8337-1c4a9bb988af\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.011967 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks5kz\" (UniqueName: \"kubernetes.io/projected/13830283-fabe-488e-98a3-767df413452b-kube-api-access-ks5kz\") pod \"13830283-fabe-488e-98a3-767df413452b\" (UID: \"13830283-fabe-488e-98a3-767df413452b\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.012170 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.012188 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b812be8d-d295-4878-9af8-c0387a655dbc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.012203 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b76tl\" (UniqueName: \"kubernetes.io/projected/b812be8d-d295-4878-9af8-c0387a655dbc-kube-api-access-b76tl\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.013648 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-utilities" (OuterVolumeSpecName: "utilities") pod "fadb3764-e589-451f-8337-1c4a9bb988af" (UID: "fadb3764-e589-451f-8337-1c4a9bb988af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.014393 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-utilities" (OuterVolumeSpecName: "utilities") pod "cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" (UID: "cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.016415 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13830283-fabe-488e-98a3-767df413452b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "13830283-fabe-488e-98a3-767df413452b" (UID: "13830283-fabe-488e-98a3-767df413452b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.016927 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13830283-fabe-488e-98a3-767df413452b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "13830283-fabe-488e-98a3-767df413452b" (UID: "13830283-fabe-488e-98a3-767df413452b"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.020376 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fadb3764-e589-451f-8337-1c4a9bb988af-kube-api-access-r5m5g" (OuterVolumeSpecName: "kube-api-access-r5m5g") pod "fadb3764-e589-451f-8337-1c4a9bb988af" (UID: "fadb3764-e589-451f-8337-1c4a9bb988af"). InnerVolumeSpecName "kube-api-access-r5m5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.025169 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-kube-api-access-5pq28" (OuterVolumeSpecName: "kube-api-access-5pq28") pod "cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" (UID: "cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e"). InnerVolumeSpecName "kube-api-access-5pq28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.032958 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13830283-fabe-488e-98a3-767df413452b-kube-api-access-ks5kz" (OuterVolumeSpecName: "kube-api-access-ks5kz") pod "13830283-fabe-488e-98a3-767df413452b" (UID: "13830283-fabe-488e-98a3-767df413452b"). InnerVolumeSpecName "kube-api-access-ks5kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.079769 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.087812 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" (UID: "cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.109162 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q9tl7"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.112997 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2jp\" (UniqueName: \"kubernetes.io/projected/8c2dfe74-d5c9-4602-a697-cc40064871b9-kube-api-access-jp2jp\") pod \"8c2dfe74-d5c9-4602-a697-cc40064871b9\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113182 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-catalog-content\") pod \"8c2dfe74-d5c9-4602-a697-cc40064871b9\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113243 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-utilities\") pod \"8c2dfe74-d5c9-4602-a697-cc40064871b9\" (UID: \"8c2dfe74-d5c9-4602-a697-cc40064871b9\") " Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113591 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5m5g\" (UniqueName: \"kubernetes.io/projected/fadb3764-e589-451f-8337-1c4a9bb988af-kube-api-access-r5m5g\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113611 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks5kz\" (UniqueName: \"kubernetes.io/projected/13830283-fabe-488e-98a3-767df413452b-kube-api-access-ks5kz\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113623 4610 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/13830283-fabe-488e-98a3-767df413452b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113632 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113640 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113649 4610 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13830283-fabe-488e-98a3-767df413452b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113658 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.113665 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pq28\" (UniqueName: \"kubernetes.io/projected/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e-kube-api-access-5pq28\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc 
kubenswrapper[4610]: I1006 08:45:46.115701 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2dfe74-d5c9-4602-a697-cc40064871b9-kube-api-access-jp2jp" (OuterVolumeSpecName: "kube-api-access-jp2jp") pod "8c2dfe74-d5c9-4602-a697-cc40064871b9" (UID: "8c2dfe74-d5c9-4602-a697-cc40064871b9"). InnerVolumeSpecName "kube-api-access-jp2jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.116266 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-utilities" (OuterVolumeSpecName: "utilities") pod "8c2dfe74-d5c9-4602-a697-cc40064871b9" (UID: "8c2dfe74-d5c9-4602-a697-cc40064871b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: W1006 08:45:46.126283 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f273f28_c469_4f2f_a0de_bad2dd1345cb.slice/crio-b5cfcb6b167abd425c2c1c51fdcacd074c30510616a529e61619d1bdb3c16519 WatchSource:0}: Error finding container b5cfcb6b167abd425c2c1c51fdcacd074c30510616a529e61619d1bdb3c16519: Status 404 returned error can't find the container with id b5cfcb6b167abd425c2c1c51fdcacd074c30510616a529e61619d1bdb3c16519 Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.175463 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fadb3764-e589-451f-8337-1c4a9bb988af" (UID: "fadb3764-e589-451f-8337-1c4a9bb988af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.193498 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c2dfe74-d5c9-4602-a697-cc40064871b9" (UID: "8c2dfe74-d5c9-4602-a697-cc40064871b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.214110 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp2jp\" (UniqueName: \"kubernetes.io/projected/8c2dfe74-d5c9-4602-a697-cc40064871b9-kube-api-access-jp2jp\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.214143 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb3764-e589-451f-8337-1c4a9bb988af-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.214178 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.214190 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2dfe74-d5c9-4602-a697-cc40064871b9-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.586639 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerID="37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66" exitCode=0 Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.586967 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerDied","Data":"37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.586997 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vcq" event={"ID":"8c2dfe74-d5c9-4602-a697-cc40064871b9","Type":"ContainerDied","Data":"535f755bed49471791a7b9d9aeb5a07fe6c6dc8b4a7737ccf916d7b91cc604ec"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.587015 4610 scope.go:117] "RemoveContainer" containerID="37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.587194 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vcq" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.592727 4610 generic.go:334] "Generic (PLEG): container finished" podID="fadb3764-e589-451f-8337-1c4a9bb988af" containerID="c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df" exitCode=0 Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.592983 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sz4l" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.592788 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerDied","Data":"c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.593139 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sz4l" event={"ID":"fadb3764-e589-451f-8337-1c4a9bb988af","Type":"ContainerDied","Data":"b8b05038f8da9907cc6f13254ab0051578b79c619cd8a7ebd1b8fd3ad36b738d"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.596626 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5nfg" event={"ID":"cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e","Type":"ContainerDied","Data":"77d6691cf38111f1c5eace19d93a46e57f92d15091f0ce29ccd2162509b990ec"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.596697 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5nfg" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.599483 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8dz4" event={"ID":"b812be8d-d295-4878-9af8-c0387a655dbc","Type":"ContainerDied","Data":"668515753e2e10ec64751921c4078775696dbe146a8eb0c060c0d9ea71755de6"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.599561 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8dz4" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.601298 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" event={"ID":"2f273f28-c469-4f2f-a0de-bad2dd1345cb","Type":"ContainerStarted","Data":"efc0383c9ab7b0e573262a995752a6b7686bf594c938e86ff8bba15c7dc52f14"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.601327 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" event={"ID":"2f273f28-c469-4f2f-a0de-bad2dd1345cb","Type":"ContainerStarted","Data":"b5cfcb6b167abd425c2c1c51fdcacd074c30510616a529e61619d1bdb3c16519"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.602014 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.604706 4610 scope.go:117] "RemoveContainer" containerID="994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.605133 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" event={"ID":"13830283-fabe-488e-98a3-767df413452b","Type":"ContainerDied","Data":"87bfa8b66f604442bc0a3eddcbec185c72d8cb61dc078e89113f612a5fd9a9a7"} Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.606422 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t5gpw" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.609878 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.632187 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q9tl7" podStartSLOduration=1.632169242 podStartE2EDuration="1.632169242s" podCreationTimestamp="2025-10-06 08:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:45:46.63174853 +0000 UTC m=+278.346801938" watchObservedRunningTime="2025-10-06 08:45:46.632169242 +0000 UTC m=+278.347222650" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.643828 4610 scope.go:117] "RemoveContainer" containerID="e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.652608 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84vcq"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.666009 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84vcq"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.670149 4610 scope.go:117] "RemoveContainer" containerID="37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66" Oct 06 08:45:46 crc kubenswrapper[4610]: E1006 08:45:46.670538 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66\": container with ID starting with 37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66 not found: ID does not exist" containerID="37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.670570 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66"} err="failed to get container status \"37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66\": rpc error: code = NotFound desc = could not find container \"37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66\": container with ID starting with 37ec5f9ebd15830488261fb4a7d5a03caf5b1a969a2894ff958f0cc810a17a66 not found: ID does not exist" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.670595 4610 scope.go:117] "RemoveContainer" containerID="994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4" Oct 06 08:45:46 crc kubenswrapper[4610]: E1006 08:45:46.672657 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4\": container with ID starting with 994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4 not found: ID does not exist" containerID="994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.672710 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4"} err="failed to get container status 
\"994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4\": rpc error: code = NotFound desc = could not find container \"994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4\": container with ID starting with 994c701be11d91684f0ae1f0527f712bf914f9974ab086316ec5ef4dda5908b4 not found: ID does not exist" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.672742 4610 scope.go:117] "RemoveContainer" containerID="e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2" Oct 06 08:45:46 crc kubenswrapper[4610]: E1006 08:45:46.673023 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2\": container with ID starting with e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2 not found: ID does not exist" containerID="e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.673066 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2"} err="failed to get container status \"e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2\": rpc error: code = NotFound desc = could not find container \"e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2\": container with ID starting with e45ee3e0b6428e30cb885dbe57c87365964adcdfe6897f36ab82691b300d7bc2 not found: ID does not exist" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.673086 4610 scope.go:117] "RemoveContainer" containerID="c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.695648 4610 scope.go:117] "RemoveContainer" containerID="716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.699882 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8dz4"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.705371 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h8dz4"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.732486 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t5gpw"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.734922 4610 scope.go:117] "RemoveContainer" containerID="ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.737141 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t5gpw"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.743288 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5nfg"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.746774 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5nfg"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.751080 4610 scope.go:117] "RemoveContainer" containerID="c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df" Oct 06 08:45:46 crc kubenswrapper[4610]: E1006 08:45:46.751521 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df\": container with ID starting with c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df not found: ID does not exist" containerID="c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.751606 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df"} err="failed to get container status \"c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df\": rpc error: code = NotFound desc = could not find container \"c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df\": container with ID starting with c32fd993ff4c2814d315220e5d7a4cf11a5b29cd64cfecb8fd17827bffc043df not found: ID does not exist" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.751649 4610 scope.go:117] "RemoveContainer" containerID="716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407" Oct 06 08:45:46 crc kubenswrapper[4610]: E1006 08:45:46.752018 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407\": container with ID starting with 716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407 not found: ID does not exist" containerID="716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.752107 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407"} err="failed to get container status \"716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407\": rpc error: code = NotFound desc = could not find container \"716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407\": container with ID starting with 716b130867193c3c24720e750c608e57c016d0d26397e39131ab236e68602407 not found: ID does not exist" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.752137 4610 scope.go:117] "RemoveContainer" containerID="ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857" Oct 06 08:45:46 crc kubenswrapper[4610]: E1006 08:45:46.752430 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857\": container with ID starting with ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857 not found: ID does not exist" containerID="ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.752499 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857"} err="failed to get container status \"ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857\": rpc error: code = NotFound desc = could not find container \"ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857\": container with ID starting with ba37cd1983c23bb528841ea7bc2faf8589c1f750eac8960fc5b8aee7f3b06857 not found: ID does not exist" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.752522 4610 scope.go:117] "RemoveContainer" containerID="68b8c6f605c526a41138896a1507177fe0b87ca94f1a5a5c3d4a931590e0ca16" Oct 06 08:45:46 crc 
kubenswrapper[4610]: I1006 08:45:46.753167 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sz4l"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.756521 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2sz4l"] Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.766167 4610 scope.go:117] "RemoveContainer" containerID="6727d94712f4ccf0542670e2320176c6780560924ce1df1a7c6181c0d75dd9da" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.780759 4610 scope.go:117] "RemoveContainer" containerID="cbd5ddb32aecbfb1839030f532732c496ef52764bfa5419a3138655f181f6ced" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.794317 4610 scope.go:117] "RemoveContainer" containerID="b485b2ae9f6c1d7c8f36165f009edbd7a6b66479ad1395c82af6cb65c10148f4" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.811658 4610 scope.go:117] "RemoveContainer" containerID="299a83729aad744562c2fa14a67504640cd34cd6343504e40ee655f46b2b4b37" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.829554 4610 scope.go:117] "RemoveContainer" containerID="a64a4f970a0bad7ea6ac834b621fa00ed867facbcdfd0530a24d7595d73cb69d" Oct 06 08:45:46 crc kubenswrapper[4610]: I1006 08:45:46.845191 4610 scope.go:117] "RemoveContainer" containerID="6cccc804c136975cd759ff8806d395cd73e72a1bda7c899ae6210fad4355bfed" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.078477 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13830283-fabe-488e-98a3-767df413452b" path="/var/lib/kubelet/pods/13830283-fabe-488e-98a3-767df413452b/volumes" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.079028 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" path="/var/lib/kubelet/pods/8c2dfe74-d5c9-4602-a697-cc40064871b9/volumes" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.079709 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" path="/var/lib/kubelet/pods/b812be8d-d295-4878-9af8-c0387a655dbc/volumes" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.081410 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" path="/var/lib/kubelet/pods/cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e/volumes" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.082030 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" path="/var/lib/kubelet/pods/fadb3764-e589-451f-8337-1c4a9bb988af/volumes" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591291 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trtt8"] Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591639 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591660 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591672 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591680 4610 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591694 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591703 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591711 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591719 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591732 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591738 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591747 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591753 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591762 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591768 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591776 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591782 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591796 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591803 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591809 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591839 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="extract-content" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591847 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591853 4610 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591864 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591870 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: E1006 08:45:47.591878 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591885 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="extract-utilities" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591982 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="13830283-fabe-488e-98a3-767df413452b" containerName="marketplace-operator" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.591998 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="fadb3764-e589-451f-8337-1c4a9bb988af" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.592005 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcf6d26-e4c1-45a2-bdf7-f6c5c8f9461e" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.592013 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2dfe74-d5c9-4602-a697-cc40064871b9" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.592019 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b812be8d-d295-4878-9af8-c0387a655dbc" containerName="registry-server" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.592950 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.599637 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.642125 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trtt8"] Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.732735 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db48478-61a7-46e8-87f2-7c4201194e49-catalog-content\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.732864 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db48478-61a7-46e8-87f2-7c4201194e49-utilities\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.732940 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xpg\" (UniqueName: \"kubernetes.io/projected/1db48478-61a7-46e8-87f2-7c4201194e49-kube-api-access-88xpg\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.788416 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6x4n"] Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.789533 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.802916 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6x4n"] Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.809557 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.833792 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db48478-61a7-46e8-87f2-7c4201194e49-catalog-content\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.834139 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db48478-61a7-46e8-87f2-7c4201194e49-utilities\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.834212 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xpg\" (UniqueName: \"kubernetes.io/projected/1db48478-61a7-46e8-87f2-7c4201194e49-kube-api-access-88xpg\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.834442 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db48478-61a7-46e8-87f2-7c4201194e49-catalog-content\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.834557 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db48478-61a7-46e8-87f2-7c4201194e49-utilities\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.861639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xpg\" (UniqueName: \"kubernetes.io/projected/1db48478-61a7-46e8-87f2-7c4201194e49-kube-api-access-88xpg\") pod \"redhat-marketplace-trtt8\" (UID: \"1db48478-61a7-46e8-87f2-7c4201194e49\") " pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.912589 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.935130 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddzxd\" (UniqueName: \"kubernetes.io/projected/13613787-1366-4ea9-8add-d39428f1514f-kube-api-access-ddzxd\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.935240 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13613787-1366-4ea9-8add-d39428f1514f-utilities\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:47 crc kubenswrapper[4610]: I1006 08:45:47.935263 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13613787-1366-4ea9-8add-d39428f1514f-catalog-content\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.035949 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddzxd\" (UniqueName: \"kubernetes.io/projected/13613787-1366-4ea9-8add-d39428f1514f-kube-api-access-ddzxd\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.035997 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13613787-1366-4ea9-8add-d39428f1514f-utilities\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.036016 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13613787-1366-4ea9-8add-d39428f1514f-catalog-content\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.036436 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13613787-1366-4ea9-8add-d39428f1514f-catalog-content\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.036940 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13613787-1366-4ea9-8add-d39428f1514f-utilities\") pod \"redhat-operators-x6x4n\" (UID: \"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.065142 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddzxd\" (UniqueName: \"kubernetes.io/projected/13613787-1366-4ea9-8add-d39428f1514f-kube-api-access-ddzxd\") pod \"redhat-operators-x6x4n\" (UID: 
\"13613787-1366-4ea9-8add-d39428f1514f\") " pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.086789 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trtt8"] Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.113902 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.327797 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6x4n"] Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.625946 4610 generic.go:334] "Generic (PLEG): container finished" podID="13613787-1366-4ea9-8add-d39428f1514f" containerID="19147f7c576fe297da3679abbe056a247d7b48b9f75e0bde64e38d1529d71d4e" exitCode=0 Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.626057 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6x4n" event={"ID":"13613787-1366-4ea9-8add-d39428f1514f","Type":"ContainerDied","Data":"19147f7c576fe297da3679abbe056a247d7b48b9f75e0bde64e38d1529d71d4e"} Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.626305 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6x4n" event={"ID":"13613787-1366-4ea9-8add-d39428f1514f","Type":"ContainerStarted","Data":"fd43ff3cd9411b204e1bef16c13a4f447e9a9d7a7f689ab9bed7a6b6ede7ff4e"} Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.628992 4610 generic.go:334] "Generic (PLEG): container finished" podID="1db48478-61a7-46e8-87f2-7c4201194e49" containerID="503ff4bb75bef47e0c29f1c1cfde00111704e9abd7178e7cde53846550b0d20c" exitCode=0 Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.629742 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trtt8" event={"ID":"1db48478-61a7-46e8-87f2-7c4201194e49","Type":"ContainerDied","Data":"503ff4bb75bef47e0c29f1c1cfde00111704e9abd7178e7cde53846550b0d20c"} Oct 06 08:45:48 crc kubenswrapper[4610]: I1006 08:45:48.629770 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trtt8" event={"ID":"1db48478-61a7-46e8-87f2-7c4201194e49","Type":"ContainerStarted","Data":"4e2249167f31bedfe730ff38bba367b6a9b2e7a13a47135f80d9a9e723baa2cb"} Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:49.987180 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f22rd"] Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:49.990950 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:49.992781 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:49.998710 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f22rd"] Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.070678 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67f97c9-f65d-4818-9b7d-568ab33ac02f-catalog-content\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.070719 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gg2k\" (UniqueName: \"kubernetes.io/projected/a67f97c9-f65d-4818-9b7d-568ab33ac02f-kube-api-access-6gg2k\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.070753 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67f97c9-f65d-4818-9b7d-568ab33ac02f-utilities\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.171912 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67f97c9-f65d-4818-9b7d-568ab33ac02f-utilities\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.172007 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67f97c9-f65d-4818-9b7d-568ab33ac02f-catalog-content\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.172038 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gg2k\" (UniqueName: \"kubernetes.io/projected/a67f97c9-f65d-4818-9b7d-568ab33ac02f-kube-api-access-6gg2k\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.173328 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67f97c9-f65d-4818-9b7d-568ab33ac02f-utilities\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.173385 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67f97c9-f65d-4818-9b7d-568ab33ac02f-catalog-content\") pod \"community-operators-f22rd\" (UID: 
\"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.189582 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrngf"] Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.191005 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.193815 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.196281 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gg2k\" (UniqueName: \"kubernetes.io/projected/a67f97c9-f65d-4818-9b7d-568ab33ac02f-kube-api-access-6gg2k\") pod \"community-operators-f22rd\" (UID: \"a67f97c9-f65d-4818-9b7d-568ab33ac02f\") " pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.201197 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrngf"] Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.274598 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee7def6-3268-4497-b20c-c0133ade55de-catalog-content\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.274649 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee7def6-3268-4497-b20c-c0133ade55de-utilities\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.274693 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8chd\" (UniqueName: \"kubernetes.io/projected/bee7def6-3268-4497-b20c-c0133ade55de-kube-api-access-x8chd\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.344003 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.376214 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee7def6-3268-4497-b20c-c0133ade55de-catalog-content\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.376281 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee7def6-3268-4497-b20c-c0133ade55de-utilities\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.376325 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8chd\" (UniqueName: \"kubernetes.io/projected/bee7def6-3268-4497-b20c-c0133ade55de-kube-api-access-x8chd\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.376848 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee7def6-3268-4497-b20c-c0133ade55de-catalog-content\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.376985 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee7def6-3268-4497-b20c-c0133ade55de-utilities\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.394436 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8chd\" (UniqueName: \"kubernetes.io/projected/bee7def6-3268-4497-b20c-c0133ade55de-kube-api-access-x8chd\") pod \"certified-operators-qrngf\" (UID: \"bee7def6-3268-4497-b20c-c0133ade55de\") " pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.552105 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.647556 4610 generic.go:334] "Generic (PLEG): container finished" podID="1db48478-61a7-46e8-87f2-7c4201194e49" containerID="c270568dc7c7723704ea04af9ed70c3741a716fc0c3531bc25766e53ca02994d" exitCode=0 Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.647632 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trtt8" event={"ID":"1db48478-61a7-46e8-87f2-7c4201194e49","Type":"ContainerDied","Data":"c270568dc7c7723704ea04af9ed70c3741a716fc0c3531bc25766e53ca02994d"} Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.652444 4610 generic.go:334] "Generic (PLEG): container finished" podID="13613787-1366-4ea9-8add-d39428f1514f" containerID="26179b3d0780ccf3b9c9ca7697e0f887a54b69bca72a58f96d84129c18adf42b" exitCode=0 Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.652482 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6x4n" event={"ID":"13613787-1366-4ea9-8add-d39428f1514f","Type":"ContainerDied","Data":"26179b3d0780ccf3b9c9ca7697e0f887a54b69bca72a58f96d84129c18adf42b"} Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.747743 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrngf"] Oct 06 08:45:50 crc kubenswrapper[4610]: W1006 08:45:50.756239 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee7def6_3268_4497_b20c_c0133ade55de.slice/crio-85a5824009e89c8ed0dfc9cea565ac6a4bbef9b4293dcb612ad75e9b0d26cddd WatchSource:0}: Error finding container 85a5824009e89c8ed0dfc9cea565ac6a4bbef9b4293dcb612ad75e9b0d26cddd: Status 404 returned error can't find the container with id 85a5824009e89c8ed0dfc9cea565ac6a4bbef9b4293dcb612ad75e9b0d26cddd Oct 06 08:45:50 crc kubenswrapper[4610]: I1006 08:45:50.790466 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f22rd"] Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.659185 4610 generic.go:334] "Generic (PLEG): container finished" podID="bee7def6-3268-4497-b20c-c0133ade55de" containerID="5f446fd3f3ac41411df7a6cb227f2698124c6190568d94088c1f4e6c64d4b8a6" exitCode=0 Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.659517 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrngf" event={"ID":"bee7def6-3268-4497-b20c-c0133ade55de","Type":"ContainerDied","Data":"5f446fd3f3ac41411df7a6cb227f2698124c6190568d94088c1f4e6c64d4b8a6"} Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.659563 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrngf" event={"ID":"bee7def6-3268-4497-b20c-c0133ade55de","Type":"ContainerStarted","Data":"85a5824009e89c8ed0dfc9cea565ac6a4bbef9b4293dcb612ad75e9b0d26cddd"} Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.663911 4610 generic.go:334] "Generic (PLEG): container finished" podID="a67f97c9-f65d-4818-9b7d-568ab33ac02f" containerID="9746b459336b067d87ca29259991af7aae02f7842bca9926a9ad249a1fbbc352" exitCode=0 Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.663989 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f22rd" 
event={"ID":"a67f97c9-f65d-4818-9b7d-568ab33ac02f","Type":"ContainerDied","Data":"9746b459336b067d87ca29259991af7aae02f7842bca9926a9ad249a1fbbc352"} Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.664679 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f22rd" event={"ID":"a67f97c9-f65d-4818-9b7d-568ab33ac02f","Type":"ContainerStarted","Data":"6f600f23cc5a659b54273af25f0b1d06c97034ab06dd9ccf3de7c40f999fd5a8"} Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.668870 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6x4n" event={"ID":"13613787-1366-4ea9-8add-d39428f1514f","Type":"ContainerStarted","Data":"d8c57a166684fbdd22a652b66c7c4fcfc7e1c2a142ba423a87f301822c9f6c00"} Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.672166 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trtt8" event={"ID":"1db48478-61a7-46e8-87f2-7c4201194e49","Type":"ContainerStarted","Data":"f90d3b207d2970bb4ec245deaa6a882444affa43c4c6c42a6e3bf2a1f988ae96"} Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.703002 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trtt8" podStartSLOduration=2.076052406 podStartE2EDuration="4.702983147s" podCreationTimestamp="2025-10-06 08:45:47 +0000 UTC" firstStartedPulling="2025-10-06 08:45:48.634335342 +0000 UTC m=+280.349388730" lastFinishedPulling="2025-10-06 08:45:51.261266083 +0000 UTC m=+282.976319471" observedRunningTime="2025-10-06 08:45:51.69855835 +0000 UTC m=+283.413611738" watchObservedRunningTime="2025-10-06 08:45:51.702983147 +0000 UTC m=+283.418036535" Oct 06 08:45:51 crc kubenswrapper[4610]: I1006 08:45:51.734810 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6x4n" podStartSLOduration=2.242764857 podStartE2EDuration="4.734793267s" podCreationTimestamp="2025-10-06 08:45:47 +0000 UTC" firstStartedPulling="2025-10-06 08:45:48.627176037 +0000 UTC m=+280.342229415" lastFinishedPulling="2025-10-06 08:45:51.119204437 +0000 UTC m=+282.834257825" observedRunningTime="2025-10-06 08:45:51.733792239 +0000 UTC m=+283.448845647" watchObservedRunningTime="2025-10-06 08:45:51.734793267 +0000 UTC m=+283.449846645" Oct 06 08:45:52 crc kubenswrapper[4610]: I1006 08:45:52.680266 4610 generic.go:334] "Generic (PLEG): container finished" podID="a67f97c9-f65d-4818-9b7d-568ab33ac02f" containerID="d15398679c15899c1cb9335fe5e9ef2761452aa2bf7c68d5af92fc7da551b5de" exitCode=0 Oct 06 08:45:52 crc kubenswrapper[4610]: I1006 08:45:52.680505 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f22rd" event={"ID":"a67f97c9-f65d-4818-9b7d-568ab33ac02f","Type":"ContainerDied","Data":"d15398679c15899c1cb9335fe5e9ef2761452aa2bf7c68d5af92fc7da551b5de"} Oct 06 08:45:52 crc kubenswrapper[4610]: I1006 08:45:52.683225 4610 generic.go:334] "Generic (PLEG): container finished" podID="bee7def6-3268-4497-b20c-c0133ade55de" containerID="1a74d0582935cca7179469bc19aadf57389c3a4f090b8fc675fce34b60825afd" exitCode=0 Oct 06 08:45:52 crc kubenswrapper[4610]: I1006 08:45:52.684277 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrngf" event={"ID":"bee7def6-3268-4497-b20c-c0133ade55de","Type":"ContainerDied","Data":"1a74d0582935cca7179469bc19aadf57389c3a4f090b8fc675fce34b60825afd"} Oct 06 08:45:54 crc 
kubenswrapper[4610]: I1006 08:45:54.695272 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f22rd" event={"ID":"a67f97c9-f65d-4818-9b7d-568ab33ac02f","Type":"ContainerStarted","Data":"233ff94eff76af6ae3037413498707a244073a3dc8ebc6fc8cbaa5caa10de88e"} Oct 06 08:45:54 crc kubenswrapper[4610]: I1006 08:45:54.697135 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrngf" event={"ID":"bee7def6-3268-4497-b20c-c0133ade55de","Type":"ContainerStarted","Data":"6510c233bc7122b990e2ead830397dd2f89ab80dd71f27596421a3dd2307fb7f"} Oct 06 08:45:54 crc kubenswrapper[4610]: I1006 08:45:54.737762 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrngf" podStartSLOduration=2.9112285509999998 podStartE2EDuration="4.737748542s" podCreationTimestamp="2025-10-06 08:45:50 +0000 UTC" firstStartedPulling="2025-10-06 08:45:51.660908932 +0000 UTC m=+283.375962330" lastFinishedPulling="2025-10-06 08:45:53.487428933 +0000 UTC m=+285.202482321" observedRunningTime="2025-10-06 08:45:54.733955273 +0000 UTC m=+286.449008661" watchObservedRunningTime="2025-10-06 08:45:54.737748542 +0000 UTC m=+286.452801930" Oct 06 08:45:54 crc kubenswrapper[4610]: I1006 08:45:54.738387 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f22rd" podStartSLOduration=4.228496142 podStartE2EDuration="5.73838277s" podCreationTimestamp="2025-10-06 08:45:49 +0000 UTC" firstStartedPulling="2025-10-06 08:45:51.666118261 +0000 UTC m=+283.381171649" lastFinishedPulling="2025-10-06 08:45:53.176004889 +0000 UTC m=+284.891058277" observedRunningTime="2025-10-06 08:45:54.716827823 +0000 UTC m=+286.431881211" watchObservedRunningTime="2025-10-06 08:45:54.73838277 +0000 UTC m=+286.453436158" Oct 06 08:45:57 crc kubenswrapper[4610]: I1006 08:45:57.913077 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:57 crc kubenswrapper[4610]: I1006 08:45:57.913698 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:57 crc kubenswrapper[4610]: I1006 08:45:57.955150 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:45:58 crc kubenswrapper[4610]: I1006 08:45:58.114935 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:58 crc kubenswrapper[4610]: I1006 08:45:58.115093 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:58 crc kubenswrapper[4610]: I1006 08:45:58.152710 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:58 crc kubenswrapper[4610]: I1006 08:45:58.774374 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6x4n" Oct 06 08:45:58 crc kubenswrapper[4610]: I1006 08:45:58.776175 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trtt8" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.345464 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.345748 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.387996 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.552874 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.552914 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.592712 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.790134 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f22rd" Oct 06 08:46:00 crc kubenswrapper[4610]: I1006 08:46:00.793915 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrngf" Oct 06 08:47:16 crc kubenswrapper[4610]: I1006 08:47:16.469264 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:47:16 crc kubenswrapper[4610]: I1006 08:47:16.470857 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:47:46 crc kubenswrapper[4610]: I1006 08:47:46.469720 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:47:46 crc kubenswrapper[4610]: I1006 08:47:46.470290 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:48:16 crc kubenswrapper[4610]: I1006 08:48:16.469637 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:48:16 crc kubenswrapper[4610]: I1006 08:48:16.470242 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:48:16 crc kubenswrapper[4610]: I1006 08:48:16.470286 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:48:16 crc kubenswrapper[4610]: I1006 08:48:16.470854 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3752446b66bca6bd4c906f7907b7377a8f00a694c4d89384cbc86783dc5e79dd"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:48:16 crc kubenswrapper[4610]: I1006 08:48:16.470900 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://3752446b66bca6bd4c906f7907b7377a8f00a694c4d89384cbc86783dc5e79dd" gracePeriod=600 Oct 06 08:48:17 crc kubenswrapper[4610]: I1006 08:48:17.506324 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="3752446b66bca6bd4c906f7907b7377a8f00a694c4d89384cbc86783dc5e79dd" exitCode=0 Oct 06 08:48:17 crc kubenswrapper[4610]: I1006 08:48:17.506423 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"3752446b66bca6bd4c906f7907b7377a8f00a694c4d89384cbc86783dc5e79dd"} Oct 06 08:48:17 crc kubenswrapper[4610]: I1006 08:48:17.506862 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"8d99637b22ec27b39b6feca1514d926ac2e627b47656d8b3fb7c174e49b48aec"} Oct 06 08:48:17 crc kubenswrapper[4610]: I1006 08:48:17.506897 4610 scope.go:117] "RemoveContainer" containerID="98c32478f7d9ed83c7ea5cd247985d59cef74dd05bcc2c93eb20853cafbc1c99" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.149046 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mqg5z"] Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.150324 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.160041 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mqg5z"] Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326370 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326426 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8b249be-3037-4620-bd38-66910338af17-registry-certificates\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326449 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-registry-tls\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326479 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-bound-sa-token\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326505 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8b249be-3037-4620-bd38-66910338af17-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326529 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8b249be-3037-4620-bd38-66910338af17-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326553 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9pf6\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-kube-api-access-z9pf6\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.326572 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b8b249be-3037-4620-bd38-66910338af17-trusted-ca\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.359947 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.429402 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9pf6\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-kube-api-access-z9pf6\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.429465 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b249be-3037-4620-bd38-66910338af17-trusted-ca\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.429534 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8b249be-3037-4620-bd38-66910338af17-registry-certificates\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.431161 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-registry-tls\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.431244 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-bound-sa-token\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.431281 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8b249be-3037-4620-bd38-66910338af17-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.431331 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8b249be-3037-4620-bd38-66910338af17-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.432298 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8b249be-3037-4620-bd38-66910338af17-registry-certificates\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.432748 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8b249be-3037-4620-bd38-66910338af17-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.433508 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8b249be-3037-4620-bd38-66910338af17-trusted-ca\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.439045 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-registry-tls\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.442876 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8b249be-3037-4620-bd38-66910338af17-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.450391 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9pf6\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-kube-api-access-z9pf6\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.451055 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8b249be-3037-4620-bd38-66910338af17-bound-sa-token\") pod \"image-registry-66df7c8f76-mqg5z\" (UID: \"b8b249be-3037-4620-bd38-66910338af17\") " pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.468171 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:34 crc kubenswrapper[4610]: I1006 08:48:34.655788 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mqg5z"] Oct 06 08:48:34 crc kubenswrapper[4610]: W1006 08:48:34.659850 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b249be_3037_4620_bd38_66910338af17.slice/crio-9d9f089c194465454953fa0f338560fa24f9e3b8903448ac8716475a55673985 WatchSource:0}: Error finding container 9d9f089c194465454953fa0f338560fa24f9e3b8903448ac8716475a55673985: Status 404 returned error can't find the container with id 9d9f089c194465454953fa0f338560fa24f9e3b8903448ac8716475a55673985 Oct 06 08:48:35 crc kubenswrapper[4610]: I1006 08:48:35.612278 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" event={"ID":"b8b249be-3037-4620-bd38-66910338af17","Type":"ContainerStarted","Data":"b9fd6bd99d591f639855efec23e5c991bda101ffc295e766ba1b54783265e612"} Oct 06 08:48:35 crc kubenswrapper[4610]: I1006 08:48:35.612642 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:35 crc kubenswrapper[4610]: I1006 08:48:35.612656 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" event={"ID":"b8b249be-3037-4620-bd38-66910338af17","Type":"ContainerStarted","Data":"9d9f089c194465454953fa0f338560fa24f9e3b8903448ac8716475a55673985"} Oct 06 08:48:35 crc kubenswrapper[4610]: I1006 08:48:35.640897 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" podStartSLOduration=1.640876139 podStartE2EDuration="1.640876139s" podCreationTimestamp="2025-10-06 08:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:48:35.637156015 +0000 UTC m=+447.352209403" watchObservedRunningTime="2025-10-06 08:48:35.640876139 +0000 UTC m=+447.355929547" Oct 06 08:48:54 crc kubenswrapper[4610]: I1006 08:48:54.476372 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mqg5z" Oct 06 08:48:54 crc kubenswrapper[4610]: I1006 08:48:54.529008 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbjqb"] Oct 06 08:49:19 crc kubenswrapper[4610]: I1006 08:49:19.572153 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" podUID="be459a27-8ce8-4825-9b01-a89a33fb81d6" containerName="registry" containerID="cri-o://df59d0bedd459dda472f5003b0de7109316e71041337dff61c465b92cfe12df9" gracePeriod=30 Oct 06 08:49:19 crc kubenswrapper[4610]: I1006 08:49:19.878722 4610 generic.go:334] "Generic (PLEG): container finished" podID="be459a27-8ce8-4825-9b01-a89a33fb81d6" containerID="df59d0bedd459dda472f5003b0de7109316e71041337dff61c465b92cfe12df9" exitCode=0 Oct 06 08:49:19 crc kubenswrapper[4610]: I1006 08:49:19.878761 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" 
event={"ID":"be459a27-8ce8-4825-9b01-a89a33fb81d6","Type":"ContainerDied","Data":"df59d0bedd459dda472f5003b0de7109316e71041337dff61c465b92cfe12df9"} Oct 06 08:49:19 crc kubenswrapper[4610]: I1006 08:49:19.878785 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" event={"ID":"be459a27-8ce8-4825-9b01-a89a33fb81d6","Type":"ContainerDied","Data":"7edb562d8f185ea157088ab8cb21876b00e92b0ba1b09c5696bbd41a9a79d655"} Oct 06 08:49:19 crc kubenswrapper[4610]: I1006 08:49:19.878797 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7edb562d8f185ea157088ab8cb21876b00e92b0ba1b09c5696bbd41a9a79d655" Oct 06 08:49:19 crc kubenswrapper[4610]: I1006 08:49:19.885513 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071124 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be459a27-8ce8-4825-9b01-a89a33fb81d6-ca-trust-extracted\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071266 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-tls\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071289 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-bound-sa-token\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071333 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be459a27-8ce8-4825-9b01-a89a33fb81d6-installation-pull-secrets\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071364 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nkmc\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-kube-api-access-5nkmc\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071499 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071535 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-trusted-ca\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.071873 4610 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-certificates\") pod \"be459a27-8ce8-4825-9b01-a89a33fb81d6\" (UID: \"be459a27-8ce8-4825-9b01-a89a33fb81d6\") " Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.072752 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.073029 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.078690 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-kube-api-access-5nkmc" (OuterVolumeSpecName: "kube-api-access-5nkmc") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "kube-api-access-5nkmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.080586 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.081377 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be459a27-8ce8-4825-9b01-a89a33fb81d6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.082026 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.083750 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.089216 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be459a27-8ce8-4825-9b01-a89a33fb81d6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "be459a27-8ce8-4825-9b01-a89a33fb81d6" (UID: "be459a27-8ce8-4825-9b01-a89a33fb81d6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173262 4610 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173315 4610 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be459a27-8ce8-4825-9b01-a89a33fb81d6-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173343 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nkmc\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-kube-api-access-5nkmc\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173357 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173371 4610 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173387 4610 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be459a27-8ce8-4825-9b01-a89a33fb81d6-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.173400 4610 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be459a27-8ce8-4825-9b01-a89a33fb81d6-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.886414 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbjqb" Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.927313 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbjqb"] Oct 06 08:49:20 crc kubenswrapper[4610]: I1006 08:49:20.930899 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbjqb"] Oct 06 08:49:21 crc kubenswrapper[4610]: I1006 08:49:21.078378 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be459a27-8ce8-4825-9b01-a89a33fb81d6" path="/var/lib/kubelet/pods/be459a27-8ce8-4825-9b01-a89a33fb81d6/volumes" Oct 06 08:50:09 crc kubenswrapper[4610]: I1006 08:50:09.214407 4610 scope.go:117] "RemoveContainer" containerID="df59d0bedd459dda472f5003b0de7109316e71041337dff61c465b92cfe12df9" Oct 06 08:50:16 crc kubenswrapper[4610]: I1006 08:50:16.469543 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:50:16 crc kubenswrapper[4610]: I1006 08:50:16.469873 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:50:46 crc kubenswrapper[4610]: I1006 08:50:46.469406 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:50:46 crc kubenswrapper[4610]: I1006 08:50:46.469896 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:51:16 crc kubenswrapper[4610]: I1006 08:51:16.469657 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:51:16 crc kubenswrapper[4610]: I1006 08:51:16.470156 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:51:16 crc kubenswrapper[4610]: I1006 08:51:16.470216 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:51:16 crc kubenswrapper[4610]: I1006 08:51:16.470872 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8d99637b22ec27b39b6feca1514d926ac2e627b47656d8b3fb7c174e49b48aec"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:51:16 crc kubenswrapper[4610]: I1006 08:51:16.470929 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://8d99637b22ec27b39b6feca1514d926ac2e627b47656d8b3fb7c174e49b48aec" gracePeriod=600 Oct 06 08:51:17 crc kubenswrapper[4610]: I1006 08:51:17.597989 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="8d99637b22ec27b39b6feca1514d926ac2e627b47656d8b3fb7c174e49b48aec" exitCode=0 Oct 06 08:51:17 crc kubenswrapper[4610]: I1006 08:51:17.598066 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"8d99637b22ec27b39b6feca1514d926ac2e627b47656d8b3fb7c174e49b48aec"} Oct 06 08:51:17 crc kubenswrapper[4610]: I1006 08:51:17.598558 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"38f8706bf8b9b80033ad9a39fb7a4758655b1b4513afef87a5de3c844e2a88e6"} Oct 06 08:51:17 crc kubenswrapper[4610]: I1006 08:51:17.598586 4610 scope.go:117] "RemoveContainer" containerID="3752446b66bca6bd4c906f7907b7377a8f00a694c4d89384cbc86783dc5e79dd" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.297610 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9rmxk"] Oct 06 08:52:45 crc kubenswrapper[4610]: E1006 08:52:45.298296 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be459a27-8ce8-4825-9b01-a89a33fb81d6" containerName="registry" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.298310 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="be459a27-8ce8-4825-9b01-a89a33fb81d6" containerName="registry" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.298400 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="be459a27-8ce8-4825-9b01-a89a33fb81d6" containerName="registry" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.298786 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.301312 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.301575 4610 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-77zk6" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.301730 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.308876 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-pxwn5"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.309634 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-pxwn5" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.310962 4610 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wrn9g" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.316399 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9rmxk"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.324514 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-pxwn5"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.330476 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zslmc"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.331136 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.332411 4610 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-h49kg" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.344569 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zslmc"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.489441 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pkf\" (UniqueName: \"kubernetes.io/projected/40b562f4-5aac-4a81-b2b9-7a449b662cfc-kube-api-access-x4pkf\") pod \"cert-manager-cainjector-7f985d654d-9rmxk\" (UID: \"40b562f4-5aac-4a81-b2b9-7a449b662cfc\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.489514 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4sjh\" (UniqueName: \"kubernetes.io/projected/66256ae8-d5ea-4800-85a5-5b61f7475b8e-kube-api-access-k4sjh\") pod \"cert-manager-5b446d88c5-pxwn5\" (UID: \"66256ae8-d5ea-4800-85a5-5b61f7475b8e\") " pod="cert-manager/cert-manager-5b446d88c5-pxwn5" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.489539 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9swv\" (UniqueName: \"kubernetes.io/projected/e7133e75-e1cc-410d-828b-18221c64707c-kube-api-access-z9swv\") pod \"cert-manager-webhook-5655c58dd6-zslmc\" (UID: \"e7133e75-e1cc-410d-828b-18221c64707c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.590348 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9swv\" (UniqueName: \"kubernetes.io/projected/e7133e75-e1cc-410d-828b-18221c64707c-kube-api-access-z9swv\") pod \"cert-manager-webhook-5655c58dd6-zslmc\" (UID: \"e7133e75-e1cc-410d-828b-18221c64707c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.590477 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pkf\" (UniqueName: \"kubernetes.io/projected/40b562f4-5aac-4a81-b2b9-7a449b662cfc-kube-api-access-x4pkf\") pod \"cert-manager-cainjector-7f985d654d-9rmxk\" (UID: \"40b562f4-5aac-4a81-b2b9-7a449b662cfc\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 
08:52:45.590524 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4sjh\" (UniqueName: \"kubernetes.io/projected/66256ae8-d5ea-4800-85a5-5b61f7475b8e-kube-api-access-k4sjh\") pod \"cert-manager-5b446d88c5-pxwn5\" (UID: \"66256ae8-d5ea-4800-85a5-5b61f7475b8e\") " pod="cert-manager/cert-manager-5b446d88c5-pxwn5" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.610756 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4sjh\" (UniqueName: \"kubernetes.io/projected/66256ae8-d5ea-4800-85a5-5b61f7475b8e-kube-api-access-k4sjh\") pod \"cert-manager-5b446d88c5-pxwn5\" (UID: \"66256ae8-d5ea-4800-85a5-5b61f7475b8e\") " pod="cert-manager/cert-manager-5b446d88c5-pxwn5" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.613814 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9swv\" (UniqueName: \"kubernetes.io/projected/e7133e75-e1cc-410d-828b-18221c64707c-kube-api-access-z9swv\") pod \"cert-manager-webhook-5655c58dd6-zslmc\" (UID: \"e7133e75-e1cc-410d-828b-18221c64707c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.615747 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pkf\" (UniqueName: \"kubernetes.io/projected/40b562f4-5aac-4a81-b2b9-7a449b662cfc-kube-api-access-x4pkf\") pod \"cert-manager-cainjector-7f985d654d-9rmxk\" (UID: \"40b562f4-5aac-4a81-b2b9-7a449b662cfc\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.619787 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.628291 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-pxwn5" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.644726 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.884980 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zslmc"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.896419 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.938811 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9rmxk"] Oct 06 08:52:45 crc kubenswrapper[4610]: I1006 08:52:45.953093 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-pxwn5"] Oct 06 08:52:45 crc kubenswrapper[4610]: W1006 08:52:45.966437 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66256ae8_d5ea_4800_85a5_5b61f7475b8e.slice/crio-437caebf19a8fb35de5288eec2d6483cdc49a3440190c660668fce6045eb316d WatchSource:0}: Error finding container 437caebf19a8fb35de5288eec2d6483cdc49a3440190c660668fce6045eb316d: Status 404 returned error can't find the container with id 437caebf19a8fb35de5288eec2d6483cdc49a3440190c660668fce6045eb316d Oct 06 08:52:46 crc kubenswrapper[4610]: I1006 08:52:46.086479 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" event={"ID":"e7133e75-e1cc-410d-828b-18221c64707c","Type":"ContainerStarted","Data":"41d0daa3631f375f79253da43ac1fc3ea9b3a59271aadfc8f798140451530d58"} Oct 06 08:52:46 crc kubenswrapper[4610]: I1006 08:52:46.087705 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" event={"ID":"40b562f4-5aac-4a81-b2b9-7a449b662cfc","Type":"ContainerStarted","Data":"0b3d88081250b8a6ea77c8badea50fec0dd86d801dfaf91b7559f0894ba8e344"} Oct 06 08:52:46 crc kubenswrapper[4610]: I1006 08:52:46.088890 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-pxwn5" event={"ID":"66256ae8-d5ea-4800-85a5-5b61f7475b8e","Type":"ContainerStarted","Data":"437caebf19a8fb35de5288eec2d6483cdc49a3440190c660668fce6045eb316d"} Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.117930 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" event={"ID":"e7133e75-e1cc-410d-828b-18221c64707c","Type":"ContainerStarted","Data":"538ebccc78a1da734df8c15deca24d1f414f08e5272d428a46c763f3a4086f6e"} Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.118501 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.119248 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" event={"ID":"40b562f4-5aac-4a81-b2b9-7a449b662cfc","Type":"ContainerStarted","Data":"65b85c20092bd056e3b3ce419fed7981fdd0a011d67d5cf1f43776431b26409d"} Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.120769 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-pxwn5" event={"ID":"66256ae8-d5ea-4800-85a5-5b61f7475b8e","Type":"ContainerStarted","Data":"b5ceba15358e56a7a1fd8a3197152ea0150405d77b4431d8169987587d0bf6ec"} Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.146022 4610 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" podStartSLOduration=1.943201409 podStartE2EDuration="6.14600504s" podCreationTimestamp="2025-10-06 08:52:45 +0000 UTC" firstStartedPulling="2025-10-06 08:52:45.895118285 +0000 UTC m=+697.610171673" lastFinishedPulling="2025-10-06 08:52:50.097921916 +0000 UTC m=+701.812975304" observedRunningTime="2025-10-06 08:52:51.144849598 +0000 UTC m=+702.859902996" watchObservedRunningTime="2025-10-06 08:52:51.14600504 +0000 UTC m=+702.861058428" Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.171600 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-9rmxk" podStartSLOduration=2.006682065 podStartE2EDuration="6.171583401s" podCreationTimestamp="2025-10-06 08:52:45 +0000 UTC" firstStartedPulling="2025-10-06 08:52:45.954375088 +0000 UTC m=+697.669428476" lastFinishedPulling="2025-10-06 08:52:50.119276414 +0000 UTC m=+701.834329812" observedRunningTime="2025-10-06 08:52:51.168060256 +0000 UTC m=+702.883113644" watchObservedRunningTime="2025-10-06 08:52:51.171583401 +0000 UTC m=+702.886636789" Oct 06 08:52:51 crc kubenswrapper[4610]: I1006 08:52:51.191483 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-pxwn5" podStartSLOduration=2.039609306 podStartE2EDuration="6.191464419s" podCreationTimestamp="2025-10-06 08:52:45 +0000 UTC" firstStartedPulling="2025-10-06 08:52:45.968271214 +0000 UTC m=+697.683324602" lastFinishedPulling="2025-10-06 08:52:50.120126327 +0000 UTC m=+701.835179715" observedRunningTime="2025-10-06 08:52:51.189379833 +0000 UTC m=+702.904433221" watchObservedRunningTime="2025-10-06 08:52:51.191464419 +0000 UTC m=+702.906517807" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.059798 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pqkpj"] Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060539 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-controller" containerID="cri-o://0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060580 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="nbdb" containerID="cri-o://33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060665 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="northd" containerID="cri-o://3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060714 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060741 4610 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="sbdb" containerID="cri-o://6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060763 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-node" containerID="cri-o://edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.060809 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-acl-logging" containerID="cri-o://447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.115835 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" containerID="cri-o://61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" gracePeriod=30 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.147235 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdc9x_03a2c34b-edd9-489b-a8e6-23502cdeb309/kube-multus/1.log" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.147936 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdc9x_03a2c34b-edd9-489b-a8e6-23502cdeb309/kube-multus/0.log" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.147977 4610 generic.go:334] "Generic (PLEG): container finished" podID="03a2c34b-edd9-489b-a8e6-23502cdeb309" containerID="8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68" exitCode=2 Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.148006 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdc9x" event={"ID":"03a2c34b-edd9-489b-a8e6-23502cdeb309","Type":"ContainerDied","Data":"8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68"} Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.148037 4610 scope.go:117] "RemoveContainer" containerID="35ddafebbfcb2a8548f4326ff1b8d4bc4548e75bdfa8b1401308ba7d4cdeef91" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.148556 4610 scope.go:117] "RemoveContainer" containerID="8d72531f94453a58e835432cded9f9d9b3b206932f1d8bbe2a80c2a9f1ef7d68" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.410866 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/2.log" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.412484 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovn-acl-logging/0.log" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.412835 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovn-controller/0.log" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.413193 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.462914 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ds7hf"] Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463121 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="northd" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463132 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="northd" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463142 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463148 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463159 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463165 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463173 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-acl-logging" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463179 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-acl-logging" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463187 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kubecfg-setup" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463193 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kubecfg-setup" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463203 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463208 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463215 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-node" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463220 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-node" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463226 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463232 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463241 4610 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="sbdb" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463246 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="sbdb" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463252 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="nbdb" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463257 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="nbdb" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463265 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463271 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463351 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="nbdb" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463361 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-acl-logging" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463370 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463379 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463387 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463393 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="sbdb" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463402 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="kube-rbac-proxy-node" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463409 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="northd" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463416 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovn-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: E1006 08:52:55.463502 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463508 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463865 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.463878 4610 
memory_manager.go:354] "RemoveStaleState removing state" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerName="ovnkube-controller" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.465334 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533402 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-env-overrides\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533450 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-ovn\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533471 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-node-log\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533487 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533511 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-var-lib-openvswitch\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533554 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533585 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-node-log" (OuterVolumeSpecName: "node-log") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533588 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533603 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533637 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533604 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-kubelet\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533838 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533868 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-slash\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533886 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-bin\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533903 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-script-lib\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533918 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-systemd\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533931 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-openvswitch\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: 
\"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533955 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-config\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533955 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533971 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-systemd-units\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533979 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-slash" (OuterVolumeSpecName: "host-slash") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533989 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovn-node-metrics-cert\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534018 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-netns\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534032 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-etc-openvswitch\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534062 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-netd\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534081 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-ovn-kubernetes\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534098 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-8fv5s\" (UniqueName: \"kubernetes.io/projected/980266ef-4c63-4532-8b33-25fa1c57a9a7-kube-api-access-8fv5s\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534116 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-log-socket\") pod \"980266ef-4c63-4532-8b33-25fa1c57a9a7\" (UID: \"980266ef-4c63-4532-8b33-25fa1c57a9a7\") " Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534173 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-slash\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534192 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovnkube-script-lib\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534213 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-run-ovn-kubernetes\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534226 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-node-log\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534241 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qcjm\" (UniqueName: \"kubernetes.io/projected/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-kube-api-access-8qcjm\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534258 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-var-lib-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534286 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovnkube-config\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534303 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-systemd\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534320 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534336 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovn-node-metrics-cert\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534350 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-cni-bin\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534363 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-kubelet\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534375 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-cni-netd\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534393 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-run-netns\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534407 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-ovn\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534425 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534443 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-etc-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534469 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-log-socket\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534484 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-systemd-units\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534498 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-env-overrides\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534528 4610 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534538 4610 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534546 4610 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-node-log\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534555 4610 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534564 4610 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534572 4610 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534581 4610 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-slash\") on node \"crc\" DevicePath \"\"" Oct 
06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534589 4610 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.533999 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534229 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534909 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.534929 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.535000 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.535010 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-log-socket" (OuterVolumeSpecName: "log-socket") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.535037 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.535113 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.535295 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.539796 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980266ef-4c63-4532-8b33-25fa1c57a9a7-kube-api-access-8fv5s" (OuterVolumeSpecName: "kube-api-access-8fv5s") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "kube-api-access-8fv5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.543250 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.548232 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "980266ef-4c63-4532-8b33-25fa1c57a9a7" (UID: "980266ef-4c63-4532-8b33-25fa1c57a9a7"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635309 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-systemd\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635362 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635381 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovn-node-metrics-cert\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635439 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-cni-bin\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635457 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-kubelet\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635471 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-cni-netd\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635488 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-run-netns\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635505 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-ovn\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635529 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 
08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635550 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-etc-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635577 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-log-socket\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635595 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-systemd-units\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635612 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-env-overrides\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635627 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-slash\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635644 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovnkube-script-lib\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635664 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-run-ovn-kubernetes\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635679 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-node-log\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635693 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qcjm\" (UniqueName: \"kubernetes.io/projected/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-kube-api-access-8qcjm\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635708 4610 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-var-lib-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635723 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovnkube-config\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635760 4610 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635770 4610 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635779 4610 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635789 4610 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635799 4610 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635807 4610 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635816 4610 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635825 4610 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635833 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fv5s\" (UniqueName: \"kubernetes.io/projected/980266ef-4c63-4532-8b33-25fa1c57a9a7-kube-api-access-8fv5s\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635841 4610 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-log-socket\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635850 4610 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/980266ef-4c63-4532-8b33-25fa1c57a9a7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.635858 4610 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/980266ef-4c63-4532-8b33-25fa1c57a9a7-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.636411 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovnkube-config\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.636690 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-systemd\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.636714 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637085 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-log-socket\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637120 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-run-ovn\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637143 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637167 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-kubelet\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637185 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-cni-bin\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637206 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-node-log\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637221 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-etc-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637236 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-slash\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637255 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-run-ovn-kubernetes\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637262 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-systemd-units\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637257 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-cni-netd\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637290 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-var-lib-openvswitch\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637303 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-host-run-netns\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.637856 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-env-overrides\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.638761 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovnkube-script-lib\") pod \"ovnkube-node-ds7hf\" (UID: 
\"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.643114 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-ovn-node-metrics-cert\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.647558 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-zslmc" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.656417 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qcjm\" (UniqueName: \"kubernetes.io/projected/50589b58-901a-4cb3-bfb8-6bde5b5c3c99-kube-api-access-8qcjm\") pod \"ovnkube-node-ds7hf\" (UID: \"50589b58-901a-4cb3-bfb8-6bde5b5c3c99\") " pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: I1006 08:52:55.778560 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:52:55 crc kubenswrapper[4610]: W1006 08:52:55.798161 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50589b58_901a_4cb3_bfb8_6bde5b5c3c99.slice/crio-b2c8870dae798cadf9c12d06827dd647b87ff4fda84573bc2edb7afdada8e8fe WatchSource:0}: Error finding container b2c8870dae798cadf9c12d06827dd647b87ff4fda84573bc2edb7afdada8e8fe: Status 404 returned error can't find the container with id b2c8870dae798cadf9c12d06827dd647b87ff4fda84573bc2edb7afdada8e8fe Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.155952 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovnkube-controller/2.log" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.159921 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovn-acl-logging/0.log" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.160814 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkpj_980266ef-4c63-4532-8b33-25fa1c57a9a7/ovn-controller/0.log" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161263 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161293 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161303 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161314 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: 
I1006 08:52:56.161323 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161331 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161340 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" exitCode=143 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161348 4610 generic.go:334] "Generic (PLEG): container finished" podID="980266ef-4c63-4532-8b33-25fa1c57a9a7" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" exitCode=143 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161349 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161383 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161396 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161406 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161415 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161431 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161414 4610 scope.go:117] "RemoveContainer" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161392 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161442 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161559 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161571 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161579 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161585 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161592 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161600 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161607 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161614 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161626 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161642 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161651 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161659 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161666 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161673 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161679 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161686 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161705 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161713 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161719 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161731 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161746 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161756 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161762 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161769 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161807 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161818 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161826 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161833 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161840 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161848 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161859 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkpj" event={"ID":"980266ef-4c63-4532-8b33-25fa1c57a9a7","Type":"ContainerDied","Data":"f1c4faa0cea60bcd12158a7e4cbcf44a7f85e5b1df6d5dc30782b215946f9f9b"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161871 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161882 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161889 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161896 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161903 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161910 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161916 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161923 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161930 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.161937 4610 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.163432 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdc9x_03a2c34b-edd9-489b-a8e6-23502cdeb309/kube-multus/1.log" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.163532 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdc9x" event={"ID":"03a2c34b-edd9-489b-a8e6-23502cdeb309","Type":"ContainerStarted","Data":"aba04463e971e789228093c84df90a40656339b504c57b5da6adabe21e952084"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.165337 4610 generic.go:334] "Generic (PLEG): container finished" podID="50589b58-901a-4cb3-bfb8-6bde5b5c3c99" containerID="298be63ef889dd220e209005b6d88be1d712467abc20d764ac4dc02d2eb4a2cc" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.165383 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerDied","Data":"298be63ef889dd220e209005b6d88be1d712467abc20d764ac4dc02d2eb4a2cc"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.165415 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"b2c8870dae798cadf9c12d06827dd647b87ff4fda84573bc2edb7afdada8e8fe"} Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.178796 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.220756 4610 scope.go:117] "RemoveContainer" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.258253 4610 scope.go:117] "RemoveContainer" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.261304 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pqkpj"] Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.268231 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pqkpj"] Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.277385 4610 scope.go:117] "RemoveContainer" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.303429 4610 scope.go:117] "RemoveContainer" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.317400 4610 scope.go:117] "RemoveContainer" containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.332618 4610 scope.go:117] "RemoveContainer" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.344677 4610 scope.go:117] "RemoveContainer" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.357397 4610 scope.go:117] "RemoveContainer" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.388165 4610 scope.go:117] 
"RemoveContainer" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.388932 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": container with ID starting with 61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc not found: ID does not exist" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.388978 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} err="failed to get container status \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": rpc error: code = NotFound desc = could not find container \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": container with ID starting with 61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.389012 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.390428 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": container with ID starting with 6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea not found: ID does not exist" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.390465 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} err="failed to get container status \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": rpc error: code = NotFound desc = could not find container \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": container with ID starting with 6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.390488 4610 scope.go:117] "RemoveContainer" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.390898 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": container with ID starting with 6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e not found: ID does not exist" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.390921 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} err="failed to get container status \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": rpc error: code = NotFound desc = could not find container \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": container with ID starting with 
6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.390936 4610 scope.go:117] "RemoveContainer" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.391373 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": container with ID starting with 33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde not found: ID does not exist" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.391389 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} err="failed to get container status \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": rpc error: code = NotFound desc = could not find container \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": container with ID starting with 33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.391401 4610 scope.go:117] "RemoveContainer" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.391968 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": container with ID starting with 3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb not found: ID does not exist" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.391990 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} err="failed to get container status \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": rpc error: code = NotFound desc = could not find container \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": container with ID starting with 3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.392004 4610 scope.go:117] "RemoveContainer" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.393381 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": container with ID starting with 4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93 not found: ID does not exist" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.393427 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} err="failed to get container status \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": rpc 
error: code = NotFound desc = could not find container \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": container with ID starting with 4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.393455 4610 scope.go:117] "RemoveContainer" containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.393736 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": container with ID starting with edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef not found: ID does not exist" containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.393759 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} err="failed to get container status \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": rpc error: code = NotFound desc = could not find container \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": container with ID starting with edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.393775 4610 scope.go:117] "RemoveContainer" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.393971 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": container with ID starting with 447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8 not found: ID does not exist" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.393992 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} err="failed to get container status \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": rpc error: code = NotFound desc = could not find container \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": container with ID starting with 447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.394008 4610 scope.go:117] "RemoveContainer" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.394616 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": container with ID starting with 0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5 not found: ID does not exist" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.394638 4610 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} err="failed to get container status \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": rpc error: code = NotFound desc = could not find container \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": container with ID starting with 0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.394650 4610 scope.go:117] "RemoveContainer" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" Oct 06 08:52:56 crc kubenswrapper[4610]: E1006 08:52:56.394827 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": container with ID starting with e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4 not found: ID does not exist" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.394845 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} err="failed to get container status \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": rpc error: code = NotFound desc = could not find container \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": container with ID starting with e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.394856 4610 scope.go:117] "RemoveContainer" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395009 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} err="failed to get container status \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": rpc error: code = NotFound desc = could not find container \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": container with ID starting with 61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395021 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395186 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} err="failed to get container status \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": rpc error: code = NotFound desc = could not find container \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": container with ID starting with 6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395202 4610 scope.go:117] "RemoveContainer" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395359 4610 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} err="failed to get container status \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": rpc error: code = NotFound desc = could not find container \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": container with ID starting with 6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395377 4610 scope.go:117] "RemoveContainer" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395577 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} err="failed to get container status \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": rpc error: code = NotFound desc = could not find container \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": container with ID starting with 33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395596 4610 scope.go:117] "RemoveContainer" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395744 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} err="failed to get container status \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": rpc error: code = NotFound desc = could not find container \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": container with ID starting with 3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395759 4610 scope.go:117] "RemoveContainer" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395894 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} err="failed to get container status \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": rpc error: code = NotFound desc = could not find container \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": container with ID starting with 4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.395912 4610 scope.go:117] "RemoveContainer" containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396070 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} err="failed to get container status \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": rpc error: code = NotFound desc = could not find container \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": container with ID starting with edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef not found: ID does not exist" Oct 
06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396088 4610 scope.go:117] "RemoveContainer" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396226 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} err="failed to get container status \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": rpc error: code = NotFound desc = could not find container \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": container with ID starting with 447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396243 4610 scope.go:117] "RemoveContainer" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396378 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} err="failed to get container status \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": rpc error: code = NotFound desc = could not find container \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": container with ID starting with 0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396395 4610 scope.go:117] "RemoveContainer" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396537 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} err="failed to get container status \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": rpc error: code = NotFound desc = could not find container \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": container with ID starting with e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396554 4610 scope.go:117] "RemoveContainer" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396691 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} err="failed to get container status \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": rpc error: code = NotFound desc = could not find container \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": container with ID starting with 61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396707 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396848 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} err="failed to get container status 
\"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": rpc error: code = NotFound desc = could not find container \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": container with ID starting with 6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.396864 4610 scope.go:117] "RemoveContainer" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397022 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} err="failed to get container status \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": rpc error: code = NotFound desc = could not find container \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": container with ID starting with 6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397072 4610 scope.go:117] "RemoveContainer" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397256 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} err="failed to get container status \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": rpc error: code = NotFound desc = could not find container \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": container with ID starting with 33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397275 4610 scope.go:117] "RemoveContainer" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397555 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} err="failed to get container status \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": rpc error: code = NotFound desc = could not find container \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": container with ID starting with 3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397573 4610 scope.go:117] "RemoveContainer" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397736 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} err="failed to get container status \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": rpc error: code = NotFound desc = could not find container \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": container with ID starting with 4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397754 4610 scope.go:117] "RemoveContainer" 
containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397900 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} err="failed to get container status \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": rpc error: code = NotFound desc = could not find container \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": container with ID starting with edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.397939 4610 scope.go:117] "RemoveContainer" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398129 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} err="failed to get container status \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": rpc error: code = NotFound desc = could not find container \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": container with ID starting with 447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398147 4610 scope.go:117] "RemoveContainer" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398341 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} err="failed to get container status \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": rpc error: code = NotFound desc = could not find container \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": container with ID starting with 0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398359 4610 scope.go:117] "RemoveContainer" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398526 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} err="failed to get container status \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": rpc error: code = NotFound desc = could not find container \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": container with ID starting with e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398544 4610 scope.go:117] "RemoveContainer" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398703 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} err="failed to get container status \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": rpc error: code = NotFound desc = could not find 
container \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": container with ID starting with 61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398720 4610 scope.go:117] "RemoveContainer" containerID="6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398857 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea"} err="failed to get container status \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": rpc error: code = NotFound desc = could not find container \"6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea\": container with ID starting with 6ab16be2f6391a78fa3b11a6ac72e036a2e1536e1816cfd75b9d3988a49670ea not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.398875 4610 scope.go:117] "RemoveContainer" containerID="6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399020 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e"} err="failed to get container status \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": rpc error: code = NotFound desc = could not find container \"6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e\": container with ID starting with 6b052514ec18912b79fc2113f7d5e68cd6188ccd63f8022f855fd1376b33d66e not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399036 4610 scope.go:117] "RemoveContainer" containerID="33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399243 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde"} err="failed to get container status \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": rpc error: code = NotFound desc = could not find container \"33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde\": container with ID starting with 33705b413ce1a66bb2b69481561c06d902d9151801b87f3cbdb336b2975c7cde not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399259 4610 scope.go:117] "RemoveContainer" containerID="3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399414 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb"} err="failed to get container status \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": rpc error: code = NotFound desc = could not find container \"3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb\": container with ID starting with 3aeebde15a0eadb19d33e82d1ae07ad09532b409820edd41730bc4697d25e9bb not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399430 4610 scope.go:117] "RemoveContainer" containerID="4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399627 4610 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93"} err="failed to get container status \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": rpc error: code = NotFound desc = could not find container \"4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93\": container with ID starting with 4fb18459dac47d91985e21dc1f133c98213433cf17e8ac0150226e1ba16e2b93 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399645 4610 scope.go:117] "RemoveContainer" containerID="edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399810 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef"} err="failed to get container status \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": rpc error: code = NotFound desc = could not find container \"edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef\": container with ID starting with edbc5fc42ba0ae3538bb373659a8f0cc2e9de049851009940f7b1012b53e41ef not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.399828 4610 scope.go:117] "RemoveContainer" containerID="447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400011 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8"} err="failed to get container status \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": rpc error: code = NotFound desc = could not find container \"447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8\": container with ID starting with 447b97c698e2580b14d8860c2d2b322637369f87ac26c509dd3033f500c780a8 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400029 4610 scope.go:117] "RemoveContainer" containerID="0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400290 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5"} err="failed to get container status \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": rpc error: code = NotFound desc = could not find container \"0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5\": container with ID starting with 0fbc54de4069bd9ee1063925db1cf15f9006ed69c0a2477bf726a9ac75b0d0d5 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400308 4610 scope.go:117] "RemoveContainer" containerID="e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400483 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4"} err="failed to get container status \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": rpc error: code = NotFound desc = could not find container \"e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4\": container with ID starting with 
e214b2686eadb3bb8f4729f8c6b83e130f3366eba8fc935de5a9804c45de75e4 not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400500 4610 scope.go:117] "RemoveContainer" containerID="61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc" Oct 06 08:52:56 crc kubenswrapper[4610]: I1006 08:52:56.400655 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc"} err="failed to get container status \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": rpc error: code = NotFound desc = could not find container \"61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc\": container with ID starting with 61661b3661eaf1d0c273839b818ec22ad93315b790b6bc93e5abc526db657ccc not found: ID does not exist" Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.076101 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980266ef-4c63-4532-8b33-25fa1c57a9a7" path="/var/lib/kubelet/pods/980266ef-4c63-4532-8b33-25fa1c57a9a7/volumes" Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.173372 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"3c19091c941d5ea0ae54dbe5c5c3c63af81d75b42bee155c3f0d12c52fcba4ca"} Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.173407 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"0be58d8d99c55d705322736a686047bf2c79d42b60c64d5389775b345b9254c9"} Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.173417 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"d04d1f04e90e386e3ea36de12a567ba4e1b08cd0c5efb79df1ac7e74f317fc9d"} Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.173425 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"e24baf4a753c9b8419b10cf9f21480a68315c477ef25da9839a3f3be2d30b8b6"} Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.173434 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"92ba0367f8c069eb5a99803d52b9500892aa28376f130fc381b5bdb99ceaa93c"} Oct 06 08:52:57 crc kubenswrapper[4610]: I1006 08:52:57.173442 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"f19cd53404b20c6e0ff6c2c4ad2b12d1cb7de177787fb71b2a020952f4e5a6e4"} Oct 06 08:52:59 crc kubenswrapper[4610]: I1006 08:52:59.185430 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"6170a8336e53044e5c0d67707d372740e3a8f041b9293056c9fd26ae83cd885e"} Oct 06 08:53:00 crc kubenswrapper[4610]: I1006 08:53:00.998696 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"] Oct 06 
08:53:00 crc kubenswrapper[4610]: I1006 08:53:00.999180 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager" containerID="cri-o://033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569" gracePeriod=30 Oct 06 08:53:01 crc kubenswrapper[4610]: I1006 08:53:01.095406 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"] Oct 06 08:53:01 crc kubenswrapper[4610]: I1006 08:53:01.095623 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" containerID="cri-o://0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520" gracePeriod=30 Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.203242 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" event={"ID":"50589b58-901a-4cb3-bfb8-6bde5b5c3c99","Type":"ContainerStarted","Data":"482624e95b948b08a940f274dcd1d9981347fb1f9a30ff15436d68609b7a05f5"} Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.204475 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.204503 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.204552 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.234140 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" podStartSLOduration=7.234125546 podStartE2EDuration="7.234125546s" podCreationTimestamp="2025-10-06 08:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:53:02.231887646 +0000 UTC m=+713.946941054" watchObservedRunningTime="2025-10-06 08:53:02.234125546 +0000 UTC m=+713.949178934" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.240649 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.245457 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ds7hf" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.583031 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.607682 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw"] Oct 06 08:53:02 crc kubenswrapper[4610]: E1006 08:53:02.607868 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.607878 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.607963 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" containerName="route-controller-manager" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.608356 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.668625 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw"] Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.678388 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-client-ca\") pod \"5e6695a0-e257-46a6-9459-7b476baa633b\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.678667 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6vpf\" (UniqueName: \"kubernetes.io/projected/5e6695a0-e257-46a6-9459-7b476baa633b-kube-api-access-b6vpf\") pod \"5e6695a0-e257-46a6-9459-7b476baa633b\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.678750 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-config\") pod \"5e6695a0-e257-46a6-9459-7b476baa633b\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.678833 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6695a0-e257-46a6-9459-7b476baa633b-serving-cert\") pod \"5e6695a0-e257-46a6-9459-7b476baa633b\" (UID: \"5e6695a0-e257-46a6-9459-7b476baa633b\") " Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.679034 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-client-ca\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.679146 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-config\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: 
\"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.679227 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hntr\" (UniqueName: \"kubernetes.io/projected/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-kube-api-access-5hntr\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.679339 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-serving-cert\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.679490 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-client-ca" (OuterVolumeSpecName: "client-ca") pod "5e6695a0-e257-46a6-9459-7b476baa633b" (UID: "5e6695a0-e257-46a6-9459-7b476baa633b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.679993 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-config" (OuterVolumeSpecName: "config") pod "5e6695a0-e257-46a6-9459-7b476baa633b" (UID: "5e6695a0-e257-46a6-9459-7b476baa633b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.685010 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6695a0-e257-46a6-9459-7b476baa633b-kube-api-access-b6vpf" (OuterVolumeSpecName: "kube-api-access-b6vpf") pod "5e6695a0-e257-46a6-9459-7b476baa633b" (UID: "5e6695a0-e257-46a6-9459-7b476baa633b"). InnerVolumeSpecName "kube-api-access-b6vpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.696703 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6695a0-e257-46a6-9459-7b476baa633b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5e6695a0-e257-46a6-9459-7b476baa633b" (UID: "5e6695a0-e257-46a6-9459-7b476baa633b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780762 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-serving-cert\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780853 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-client-ca\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780878 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-config\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780899 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hntr\" (UniqueName: \"kubernetes.io/projected/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-kube-api-access-5hntr\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780967 4610 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780978 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6695a0-e257-46a6-9459-7b476baa633b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.780988 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6vpf\" (UniqueName: \"kubernetes.io/projected/5e6695a0-e257-46a6-9459-7b476baa633b-kube-api-access-b6vpf\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.781000 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e6695a0-e257-46a6-9459-7b476baa633b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.782380 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-client-ca\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.782477 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-config\") pod \"route-controller-manager-7dfcff8884-9sckw\" 
(UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.784008 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-serving-cert\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.802851 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hntr\" (UniqueName: \"kubernetes.io/projected/f6f2c24d-5f7b-403a-8943-37b21a3e14f6-kube-api-access-5hntr\") pod \"route-controller-manager-7dfcff8884-9sckw\" (UID: \"f6f2c24d-5f7b-403a-8943-37b21a3e14f6\") " pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: I1006 08:53:02.927910 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: E1006 08:53:02.950815 4610 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(0b5899e890262fe5ada441592fb8a0ecf7ddfe1003f04aba1532b09c65cadf22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:53:02 crc kubenswrapper[4610]: E1006 08:53:02.951039 4610 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(0b5899e890262fe5ada441592fb8a0ecf7ddfe1003f04aba1532b09c65cadf22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: E1006 08:53:02.951165 4610 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(0b5899e890262fe5ada441592fb8a0ecf7ddfe1003f04aba1532b09c65cadf22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:02 crc kubenswrapper[4610]: E1006 08:53:02.951273 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager(f6f2c24d-5f7b-403a-8943-37b21a3e14f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager(f6f2c24d-5f7b-403a-8943-37b21a3e14f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(0b5899e890262fe5ada441592fb8a0ecf7ddfe1003f04aba1532b09c65cadf22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" podUID="f6f2c24d-5f7b-403a-8943-37b21a3e14f6" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.020989 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.084752 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bt6z\" (UniqueName: \"kubernetes.io/projected/1293a8cf-7266-4bf1-bc49-b8369656484b-kube-api-access-8bt6z\") pod \"1293a8cf-7266-4bf1-bc49-b8369656484b\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.084902 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-config\") pod \"1293a8cf-7266-4bf1-bc49-b8369656484b\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.085803 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-config" (OuterVolumeSpecName: "config") pod "1293a8cf-7266-4bf1-bc49-b8369656484b" (UID: "1293a8cf-7266-4bf1-bc49-b8369656484b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.087227 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1293a8cf-7266-4bf1-bc49-b8369656484b" (UID: "1293a8cf-7266-4bf1-bc49-b8369656484b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.087285 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-proxy-ca-bundles\") pod \"1293a8cf-7266-4bf1-bc49-b8369656484b\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.087423 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1293a8cf-7266-4bf1-bc49-b8369656484b-serving-cert\") pod \"1293a8cf-7266-4bf1-bc49-b8369656484b\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.087865 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1293a8cf-7266-4bf1-bc49-b8369656484b-kube-api-access-8bt6z" (OuterVolumeSpecName: "kube-api-access-8bt6z") pod "1293a8cf-7266-4bf1-bc49-b8369656484b" (UID: "1293a8cf-7266-4bf1-bc49-b8369656484b"). InnerVolumeSpecName "kube-api-access-8bt6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.087959 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-client-ca\") pod \"1293a8cf-7266-4bf1-bc49-b8369656484b\" (UID: \"1293a8cf-7266-4bf1-bc49-b8369656484b\") " Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.088416 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1293a8cf-7266-4bf1-bc49-b8369656484b" (UID: "1293a8cf-7266-4bf1-bc49-b8369656484b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.088516 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bt6z\" (UniqueName: \"kubernetes.io/projected/1293a8cf-7266-4bf1-bc49-b8369656484b-kube-api-access-8bt6z\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.088539 4610 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.088555 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.090908 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1293a8cf-7266-4bf1-bc49-b8369656484b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1293a8cf-7266-4bf1-bc49-b8369656484b" (UID: "1293a8cf-7266-4bf1-bc49-b8369656484b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.189823 4610 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1293a8cf-7266-4bf1-bc49-b8369656484b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.189858 4610 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1293a8cf-7266-4bf1-bc49-b8369656484b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.208208 4610 generic.go:334] "Generic (PLEG): container finished" podID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerID="033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569" exitCode=0 Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.208276 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.208317 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" event={"ID":"1293a8cf-7266-4bf1-bc49-b8369656484b","Type":"ContainerDied","Data":"033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569"} Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.208345 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7qsjg" event={"ID":"1293a8cf-7266-4bf1-bc49-b8369656484b","Type":"ContainerDied","Data":"732c42b4e15b2538e79b4c43c43939875308f87fe29d413c9b682580b89c9f1c"} Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.208365 4610 scope.go:117] "RemoveContainer" containerID="033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.210188 4610 generic.go:334] "Generic (PLEG): container finished" podID="5e6695a0-e257-46a6-9459-7b476baa633b" containerID="0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520" exitCode=0 Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.211271 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.211658 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" event={"ID":"5e6695a0-e257-46a6-9459-7b476baa633b","Type":"ContainerDied","Data":"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520"} Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.211687 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk" event={"ID":"5e6695a0-e257-46a6-9459-7b476baa633b","Type":"ContainerDied","Data":"8e9dbfa2fa4b9d97c89acdb812ae405be06dacb0d8a855d100e3126d77a4e2a8"} Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.211718 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.211911 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.242304 4610 scope.go:117] "RemoveContainer" containerID="033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.244962 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"] Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.245014 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569\": container with ID starting with 033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569 not found: ID does not exist" containerID="033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.245058 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569"} err="failed to get container status \"033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569\": rpc error: code = NotFound desc = could not find container \"033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569\": container with ID starting with 033b3f3694571454fc6d63b34ac74773451f597b53e33c02cd9bcc91a64d4569 not found: ID does not exist" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.245083 4610 scope.go:117] "RemoveContainer" containerID="0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.250014 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2ktvk"] Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.261301 4610 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(f52219b97a9ea4f5c99a8ad6654654b8bb0eb6680d2f4084dc766af2a4c2e178): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.261385 4610 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(f52219b97a9ea4f5c99a8ad6654654b8bb0eb6680d2f4084dc766af2a4c2e178): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.261420 4610 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(f52219b97a9ea4f5c99a8ad6654654b8bb0eb6680d2f4084dc766af2a4c2e178): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.261461 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager(f6f2c24d-5f7b-403a-8943-37b21a3e14f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager(f6f2c24d-5f7b-403a-8943-37b21a3e14f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7dfcff8884-9sckw_openshift-route-controller-manager_f6f2c24d-5f7b-403a-8943-37b21a3e14f6_0(f52219b97a9ea4f5c99a8ad6654654b8bb0eb6680d2f4084dc766af2a4c2e178): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" podUID="f6f2c24d-5f7b-403a-8943-37b21a3e14f6" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.264868 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"] Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.266253 4610 scope.go:117] "RemoveContainer" containerID="0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520" Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.270232 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520\": container with ID starting with 0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520 not found: ID does not exist" containerID="0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.270305 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520"} err="failed to get container status \"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520\": rpc error: code = NotFound desc = could not find container \"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520\": container with ID starting with 0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520 not found: ID does not exist" Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.273520 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"] Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.076554 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" path="/var/lib/kubelet/pods/1293a8cf-7266-4bf1-bc49-b8369656484b/volumes" Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.077171 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" path="/var/lib/kubelet/pods/5e6695a0-e257-46a6-9459-7b476baa633b/volumes" Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.529763 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c46cd45c8-snrvw"] Oct 06 08:53:05 crc kubenswrapper[4610]: E1006 08:53:05.530365 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager" Oct 06 08:53:05 
Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.264868 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"]
Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.266253 4610 scope.go:117] "RemoveContainer" containerID="0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520"
Oct 06 08:53:03 crc kubenswrapper[4610]: E1006 08:53:03.270232 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520\": container with ID starting with 0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520 not found: ID does not exist" containerID="0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520"
Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.270305 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520"} err="failed to get container status \"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520\": rpc error: code = NotFound desc = could not find container \"0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520\": container with ID starting with 0df303ce988c3eddf1d5873f04aa44dc9caf58a82ea16e9e116627ef2e2b3520 not found: ID does not exist"
Oct 06 08:53:03 crc kubenswrapper[4610]: I1006 08:53:03.273520 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7qsjg"]
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.076554 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" path="/var/lib/kubelet/pods/1293a8cf-7266-4bf1-bc49-b8369656484b/volumes"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.077171 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6695a0-e257-46a6-9459-7b476baa633b" path="/var/lib/kubelet/pods/5e6695a0-e257-46a6-9459-7b476baa633b/volumes"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.529763 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c46cd45c8-snrvw"]
Oct 06 08:53:05 crc kubenswrapper[4610]: E1006 08:53:05.530365 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.530395 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.530558 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="1293a8cf-7266-4bf1-bc49-b8369656484b" containerName="controller-manager"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.531087 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.533272 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.533423 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.533490 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.535660 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.535886 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.537346 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.548745 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c46cd45c8-snrvw"]
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.562668 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.619272 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-proxy-ca-bundles\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.619385 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-config\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.619411 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-client-ca\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.619435 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzgz\" (UniqueName: \"kubernetes.io/projected/3a3ebf09-4649-4057-981d-4d24b72b2d1d-kube-api-access-kwzgz\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.619691 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3ebf09-4649-4057-981d-4d24b72b2d1d-serving-cert\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.720402 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3ebf09-4649-4057-981d-4d24b72b2d1d-serving-cert\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.720470 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-proxy-ca-bundles\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.720507 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-config\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.720528 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-client-ca\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.720552 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwzgz\" (UniqueName: \"kubernetes.io/projected/3a3ebf09-4649-4057-981d-4d24b72b2d1d-kube-api-access-kwzgz\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.721666 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-proxy-ca-bundles\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.722001 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-client-ca\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.722492 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a3ebf09-4649-4057-981d-4d24b72b2d1d-config\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.728199 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3ebf09-4649-4057-981d-4d24b72b2d1d-serving-cert\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.748791 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzgz\" (UniqueName: \"kubernetes.io/projected/3a3ebf09-4649-4057-981d-4d24b72b2d1d-kube-api-access-kwzgz\") pod \"controller-manager-c46cd45c8-snrvw\" (UID: \"3a3ebf09-4649-4057-981d-4d24b72b2d1d\") " pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:05 crc kubenswrapper[4610]: I1006 08:53:05.859954 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:06 crc kubenswrapper[4610]: I1006 08:53:06.071753 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c46cd45c8-snrvw"]
Oct 06 08:53:06 crc kubenswrapper[4610]: I1006 08:53:06.229932 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw" event={"ID":"3a3ebf09-4649-4057-981d-4d24b72b2d1d","Type":"ContainerStarted","Data":"8049890d8dfa25eb39e210399420cab4a320fcd21f9a672b8217c636bae45c0f"}
Oct 06 08:53:06 crc kubenswrapper[4610]: I1006 08:53:06.229976 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw" event={"ID":"3a3ebf09-4649-4057-981d-4d24b72b2d1d","Type":"ContainerStarted","Data":"83dc07d1f8bb0c6385381d370215f3227ef83120f72601173a5bbf1b8e8c64f0"}
Oct 06 08:53:06 crc kubenswrapper[4610]: I1006 08:53:06.230864 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:06 crc kubenswrapper[4610]: I1006 08:53:06.231904 4610 patch_prober.go:28] interesting pod/controller-manager-c46cd45c8-snrvw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Oct 06 08:53:06 crc kubenswrapper[4610]: I1006 08:53:06.231952 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw" podUID="3a3ebf09-4649-4057-981d-4d24b72b2d1d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Oct 06 08:53:07 crc kubenswrapper[4610]: I1006 08:53:07.237998 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw"
Oct 06 08:53:07 crc kubenswrapper[4610]: I1006 08:53:07.255880 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c46cd45c8-snrvw" podStartSLOduration=6.255859472 podStartE2EDuration="6.255859472s" podCreationTimestamp="2025-10-06 08:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:53:06.254658142 +0000 UTC m=+717.969711560" watchObservedRunningTime="2025-10-06 08:53:07.255859472 +0000 UTC m=+718.970912860"
Oct 06 08:53:07 crc kubenswrapper[4610]: I1006 08:53:07.315949 4610 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 08:53:16 crc kubenswrapper[4610]: I1006 08:53:16.070086 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw"
Oct 06 08:53:16 crc kubenswrapper[4610]: I1006 08:53:16.071370 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw"
Oct 06 08:53:16 crc kubenswrapper[4610]: I1006 08:53:16.469345 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:53:16 crc kubenswrapper[4610]: I1006 08:53:16.469744 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:53:16 crc kubenswrapper[4610]: I1006 08:53:16.497954 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw"]
Oct 06 08:53:16 crc kubenswrapper[4610]: W1006 08:53:16.512180 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f2c24d_5f7b_403a_8943_37b21a3e14f6.slice/crio-95759d06247f26df411bf301ba7572c990e9c2bec6b4afa93cc8e0340cc29828 WatchSource:0}: Error finding container 95759d06247f26df411bf301ba7572c990e9c2bec6b4afa93cc8e0340cc29828: Status 404 returned error can't find the container with id 95759d06247f26df411bf301ba7572c990e9c2bec6b4afa93cc8e0340cc29828
Oct 06 08:53:17 crc kubenswrapper[4610]: I1006 08:53:17.295404 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" event={"ID":"f6f2c24d-5f7b-403a-8943-37b21a3e14f6","Type":"ContainerStarted","Data":"9485ddb7aa8982e176dab45f77be392c5d72f513aee4a3df4d414c23b0034c8b"}
Oct 06 08:53:17 crc kubenswrapper[4610]: I1006 08:53:17.295697 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" event={"ID":"f6f2c24d-5f7b-403a-8943-37b21a3e14f6","Type":"ContainerStarted","Data":"95759d06247f26df411bf301ba7572c990e9c2bec6b4afa93cc8e0340cc29828"}
Oct 06 08:53:17 crc kubenswrapper[4610]: I1006 08:53:17.296028 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw"
Oct 06 08:53:17 crc kubenswrapper[4610]: I1006 08:53:17.316469 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7dfcff8884-9sckw" podStartSLOduration=16.31645002 podStartE2EDuration="16.31645002s" podCreationTimestamp="2025-10-06 08:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:53:17.313549131 +0000 UTC m=+729.028602529" watchObservedRunningTime="2025-10-06 08:53:17.31645002 +0000 UTC m=+729.031503408"
\"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:38 crc kubenswrapper[4610]: I1006 08:53:38.847076 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:38 crc kubenswrapper[4610]: I1006 08:53:38.847210 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgp5b\" (UniqueName: \"kubernetes.io/projected/97e4f094-7f15-4140-b0dd-10f545a9fef3-kube-api-access-pgp5b\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:38 crc kubenswrapper[4610]: I1006 08:53:38.847290 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:38 crc kubenswrapper[4610]: I1006 08:53:38.847650 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:38 crc kubenswrapper[4610]: I1006 08:53:38.868988 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgp5b\" (UniqueName: \"kubernetes.io/projected/97e4f094-7f15-4140-b0dd-10f545a9fef3-kube-api-access-pgp5b\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:39 crc kubenswrapper[4610]: I1006 08:53:39.026356 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:39 crc kubenswrapper[4610]: I1006 08:53:39.426809 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt"] Oct 06 08:53:40 crc kubenswrapper[4610]: I1006 08:53:40.435496 4610 generic.go:334] "Generic (PLEG): container finished" podID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerID="1d69b3133aefc23c62af5566aedd54272aed9b11cec7ae82da070c4e6b085ffb" exitCode=0 Oct 06 08:53:40 crc kubenswrapper[4610]: I1006 08:53:40.435778 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" event={"ID":"97e4f094-7f15-4140-b0dd-10f545a9fef3","Type":"ContainerDied","Data":"1d69b3133aefc23c62af5566aedd54272aed9b11cec7ae82da070c4e6b085ffb"} Oct 06 08:53:40 crc kubenswrapper[4610]: I1006 08:53:40.438124 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" event={"ID":"97e4f094-7f15-4140-b0dd-10f545a9fef3","Type":"ContainerStarted","Data":"790403d9be096599fe8ee493259c8c132a561261af017c37d63c40a59218f04e"} Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.076980 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lvk5m"] Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.078386 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.109009 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvk5m"] Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.181922 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-utilities\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.181979 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-catalog-content\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.182014 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qctxr\" (UniqueName: \"kubernetes.io/projected/17f31c0f-1354-4048-ad5e-961c12a71ca0-kube-api-access-qctxr\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.283406 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-utilities\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.283460 4610 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-catalog-content\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.283500 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qctxr\" (UniqueName: \"kubernetes.io/projected/17f31c0f-1354-4048-ad5e-961c12a71ca0-kube-api-access-qctxr\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.284368 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-utilities\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.284639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-catalog-content\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.308308 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qctxr\" (UniqueName: \"kubernetes.io/projected/17f31c0f-1354-4048-ad5e-961c12a71ca0-kube-api-access-qctxr\") pod \"redhat-operators-lvk5m\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.405515 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:41 crc kubenswrapper[4610]: I1006 08:53:41.853433 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvk5m"] Oct 06 08:53:41 crc kubenswrapper[4610]: W1006 08:53:41.859725 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f31c0f_1354_4048_ad5e_961c12a71ca0.slice/crio-d127315aa283cce857c69824e7c039aadf51e2c7a80a96083e2b6ff152ecb703 WatchSource:0}: Error finding container d127315aa283cce857c69824e7c039aadf51e2c7a80a96083e2b6ff152ecb703: Status 404 returned error can't find the container with id d127315aa283cce857c69824e7c039aadf51e2c7a80a96083e2b6ff152ecb703 Oct 06 08:53:42 crc kubenswrapper[4610]: I1006 08:53:42.451308 4610 generic.go:334] "Generic (PLEG): container finished" podID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerID="840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a" exitCode=0 Oct 06 08:53:42 crc kubenswrapper[4610]: I1006 08:53:42.451404 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerDied","Data":"840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a"} Oct 06 08:53:42 crc kubenswrapper[4610]: I1006 08:53:42.451433 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerStarted","Data":"d127315aa283cce857c69824e7c039aadf51e2c7a80a96083e2b6ff152ecb703"} Oct 06 08:53:42 crc kubenswrapper[4610]: I1006 08:53:42.453511 4610 generic.go:334] "Generic (PLEG): container finished" podID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerID="87fd4dcb52d7149d88a3c3ca2492897f803cbffd5e3fac219d2d0a550aeca36c" exitCode=0 Oct 06 08:53:42 crc kubenswrapper[4610]: I1006 08:53:42.453532 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" event={"ID":"97e4f094-7f15-4140-b0dd-10f545a9fef3","Type":"ContainerDied","Data":"87fd4dcb52d7149d88a3c3ca2492897f803cbffd5e3fac219d2d0a550aeca36c"} Oct 06 08:53:43 crc kubenswrapper[4610]: I1006 08:53:43.463510 4610 generic.go:334] "Generic (PLEG): container finished" podID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerID="807f1532c5882e947dc13c313bc0603e046175c595fe12b5b56c307ca1fce063" exitCode=0 Oct 06 08:53:43 crc kubenswrapper[4610]: I1006 08:53:43.463610 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" event={"ID":"97e4f094-7f15-4140-b0dd-10f545a9fef3","Type":"ContainerDied","Data":"807f1532c5882e947dc13c313bc0603e046175c595fe12b5b56c307ca1fce063"} Oct 06 08:53:43 crc kubenswrapper[4610]: I1006 08:53:43.465821 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerStarted","Data":"e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69"} Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.477107 4610 generic.go:334] "Generic (PLEG): container finished" podID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerID="e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69" exitCode=0 Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.477238 4610 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerDied","Data":"e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69"} Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.731977 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.823547 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-bundle\") pod \"97e4f094-7f15-4140-b0dd-10f545a9fef3\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.823623 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-util\") pod \"97e4f094-7f15-4140-b0dd-10f545a9fef3\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.823664 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgp5b\" (UniqueName: \"kubernetes.io/projected/97e4f094-7f15-4140-b0dd-10f545a9fef3-kube-api-access-pgp5b\") pod \"97e4f094-7f15-4140-b0dd-10f545a9fef3\" (UID: \"97e4f094-7f15-4140-b0dd-10f545a9fef3\") " Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.825131 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-bundle" (OuterVolumeSpecName: "bundle") pod "97e4f094-7f15-4140-b0dd-10f545a9fef3" (UID: "97e4f094-7f15-4140-b0dd-10f545a9fef3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.830199 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e4f094-7f15-4140-b0dd-10f545a9fef3-kube-api-access-pgp5b" (OuterVolumeSpecName: "kube-api-access-pgp5b") pod "97e4f094-7f15-4140-b0dd-10f545a9fef3" (UID: "97e4f094-7f15-4140-b0dd-10f545a9fef3"). InnerVolumeSpecName "kube-api-access-pgp5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.925731 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgp5b\" (UniqueName: \"kubernetes.io/projected/97e4f094-7f15-4140-b0dd-10f545a9fef3-kube-api-access-pgp5b\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:44 crc kubenswrapper[4610]: I1006 08:53:44.925798 4610 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.110227 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-util" (OuterVolumeSpecName: "util") pod "97e4f094-7f15-4140-b0dd-10f545a9fef3" (UID: "97e4f094-7f15-4140-b0dd-10f545a9fef3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.129085 4610 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97e4f094-7f15-4140-b0dd-10f545a9fef3-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.484478 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" event={"ID":"97e4f094-7f15-4140-b0dd-10f545a9fef3","Type":"ContainerDied","Data":"790403d9be096599fe8ee493259c8c132a561261af017c37d63c40a59218f04e"} Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.484524 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="790403d9be096599fe8ee493259c8c132a561261af017c37d63c40a59218f04e" Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.484596 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt" Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.488142 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerStarted","Data":"8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221"} Oct 06 08:53:45 crc kubenswrapper[4610]: I1006 08:53:45.768544 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lvk5m" podStartSLOduration=2.074470382 podStartE2EDuration="4.768522076s" podCreationTimestamp="2025-10-06 08:53:41 +0000 UTC" firstStartedPulling="2025-10-06 08:53:42.453499309 +0000 UTC m=+754.168552697" lastFinishedPulling="2025-10-06 08:53:45.147550993 +0000 UTC m=+756.862604391" observedRunningTime="2025-10-06 08:53:45.509682095 +0000 UTC m=+757.224735563" watchObservedRunningTime="2025-10-06 08:53:45.768522076 +0000 UTC m=+757.483575484" Oct 06 08:53:46 crc kubenswrapper[4610]: I1006 08:53:46.468889 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:53:46 crc kubenswrapper[4610]: I1006 08:53:46.469412 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.322426 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l"] Oct 06 08:53:49 crc kubenswrapper[4610]: E1006 08:53:49.322922 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerName="extract" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.322937 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerName="extract" Oct 06 08:53:49 crc kubenswrapper[4610]: E1006 08:53:49.322948 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" 
containerName="util" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.322956 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerName="util" Oct 06 08:53:49 crc kubenswrapper[4610]: E1006 08:53:49.322966 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerName="pull" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.322976 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerName="pull" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.323108 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e4f094-7f15-4140-b0dd-10f545a9fef3" containerName="extract" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.323523 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.325422 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mk6ld" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.325737 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.331191 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.339233 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l"] Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.473931 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t2hb\" (UniqueName: \"kubernetes.io/projected/8ceb9be4-5b44-4da2-adb3-fcfca400d23a-kube-api-access-2t2hb\") pod \"nmstate-operator-858ddd8f98-bzk4l\" (UID: \"8ceb9be4-5b44-4da2-adb3-fcfca400d23a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.576561 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t2hb\" (UniqueName: \"kubernetes.io/projected/8ceb9be4-5b44-4da2-adb3-fcfca400d23a-kube-api-access-2t2hb\") pod \"nmstate-operator-858ddd8f98-bzk4l\" (UID: \"8ceb9be4-5b44-4da2-adb3-fcfca400d23a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.599089 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t2hb\" (UniqueName: \"kubernetes.io/projected/8ceb9be4-5b44-4da2-adb3-fcfca400d23a-kube-api-access-2t2hb\") pod \"nmstate-operator-858ddd8f98-bzk4l\" (UID: \"8ceb9be4-5b44-4da2-adb3-fcfca400d23a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.640540 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" Oct 06 08:53:49 crc kubenswrapper[4610]: I1006 08:53:49.900021 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l"] Oct 06 08:53:50 crc kubenswrapper[4610]: I1006 08:53:50.515172 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" event={"ID":"8ceb9be4-5b44-4da2-adb3-fcfca400d23a","Type":"ContainerStarted","Data":"59f68ecf404a4fda98ee94e4f759da641a67f93b6e2e14a6bd916f779203ea8a"} Oct 06 08:53:51 crc kubenswrapper[4610]: I1006 08:53:51.406165 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:51 crc kubenswrapper[4610]: I1006 08:53:51.406478 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:51 crc kubenswrapper[4610]: I1006 08:53:51.452675 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:51 crc kubenswrapper[4610]: I1006 08:53:51.556274 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:53 crc kubenswrapper[4610]: I1006 08:53:53.532160 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" event={"ID":"8ceb9be4-5b44-4da2-adb3-fcfca400d23a","Type":"ContainerStarted","Data":"3ecb9919e5cbbb69f01a72e368e64675b3be267058f26e9dd9d29a2fa001a768"} Oct 06 08:53:53 crc kubenswrapper[4610]: I1006 08:53:53.550529 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bzk4l" podStartSLOduration=1.559557443 podStartE2EDuration="4.550507087s" podCreationTimestamp="2025-10-06 08:53:49 +0000 UTC" firstStartedPulling="2025-10-06 08:53:49.925124732 +0000 UTC m=+761.640178120" lastFinishedPulling="2025-10-06 08:53:52.916074376 +0000 UTC m=+764.631127764" observedRunningTime="2025-10-06 08:53:53.549638613 +0000 UTC m=+765.264692041" watchObservedRunningTime="2025-10-06 08:53:53.550507087 +0000 UTC m=+765.265560475" Oct 06 08:53:53 crc kubenswrapper[4610]: I1006 08:53:53.859578 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvk5m"] Oct 06 08:53:53 crc kubenswrapper[4610]: I1006 08:53:53.859808 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lvk5m" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="registry-server" containerID="cri-o://8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221" gracePeriod=2 Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.364841 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.547917 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-catalog-content\") pod \"17f31c0f-1354-4048-ad5e-961c12a71ca0\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.547998 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qctxr\" (UniqueName: \"kubernetes.io/projected/17f31c0f-1354-4048-ad5e-961c12a71ca0-kube-api-access-qctxr\") pod \"17f31c0f-1354-4048-ad5e-961c12a71ca0\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.548030 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-utilities\") pod \"17f31c0f-1354-4048-ad5e-961c12a71ca0\" (UID: \"17f31c0f-1354-4048-ad5e-961c12a71ca0\") " Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.549342 4610 generic.go:334] "Generic (PLEG): container finished" podID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerID="8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221" exitCode=0 Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.549415 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvk5m" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.549446 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerDied","Data":"8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221"} Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.549484 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvk5m" event={"ID":"17f31c0f-1354-4048-ad5e-961c12a71ca0","Type":"ContainerDied","Data":"d127315aa283cce857c69824e7c039aadf51e2c7a80a96083e2b6ff152ecb703"} Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.549502 4610 scope.go:117] "RemoveContainer" containerID="8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.550143 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-utilities" (OuterVolumeSpecName: "utilities") pod "17f31c0f-1354-4048-ad5e-961c12a71ca0" (UID: "17f31c0f-1354-4048-ad5e-961c12a71ca0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.568088 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f31c0f-1354-4048-ad5e-961c12a71ca0-kube-api-access-qctxr" (OuterVolumeSpecName: "kube-api-access-qctxr") pod "17f31c0f-1354-4048-ad5e-961c12a71ca0" (UID: "17f31c0f-1354-4048-ad5e-961c12a71ca0"). InnerVolumeSpecName "kube-api-access-qctxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.586739 4610 scope.go:117] "RemoveContainer" containerID="e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.604241 4610 scope.go:117] "RemoveContainer" containerID="840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.622546 4610 scope.go:117] "RemoveContainer" containerID="8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221" Oct 06 08:53:54 crc kubenswrapper[4610]: E1006 08:53:54.623143 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221\": container with ID starting with 8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221 not found: ID does not exist" containerID="8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.623237 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221"} err="failed to get container status \"8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221\": rpc error: code = NotFound desc = could not find container \"8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221\": container with ID starting with 8928aa50ad45c1762ef33978e7c7115b4cd092fe595cea7b1cecc4b4906e8221 not found: ID does not exist" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.623319 4610 scope.go:117] "RemoveContainer" containerID="e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69" Oct 06 08:53:54 crc kubenswrapper[4610]: E1006 08:53:54.623715 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69\": container with ID starting with e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69 not found: ID does not exist" containerID="e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.623847 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69"} err="failed to get container status \"e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69\": rpc error: code = NotFound desc = could not find container \"e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69\": container with ID starting with e880d68a219f3158372093ce85217d0fd1f2a4c5c09793fdc88974ddea518d69 not found: ID does not exist" Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.623952 4610 scope.go:117] "RemoveContainer" containerID="840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a" Oct 06 08:53:54 crc kubenswrapper[4610]: E1006 08:53:54.624449 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a\": container with ID starting with 840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a not found: ID does not exist" containerID="840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a" Oct 06 08:53:54 crc 
kubenswrapper[4610]: I1006 08:53:54.624488 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a"} err="failed to get container status \"840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a\": rpc error: code = NotFound desc = could not find container \"840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a\": container with ID starting with 840ca3f886bcafa5c6c30bbd110397e46e5b5f34c3c418da9354cc8ace54b68a not found: ID does not exist"
Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.649597 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qctxr\" (UniqueName: \"kubernetes.io/projected/17f31c0f-1354-4048-ad5e-961c12a71ca0-kube-api-access-qctxr\") on node \"crc\" DevicePath \"\""
Oct 06 08:53:54 crc kubenswrapper[4610]: I1006 08:53:54.649776 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 08:53:55 crc kubenswrapper[4610]: I1006 08:53:55.590949 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17f31c0f-1354-4048-ad5e-961c12a71ca0" (UID: "17f31c0f-1354-4048-ad5e-961c12a71ca0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:53:55 crc kubenswrapper[4610]: I1006 08:53:55.662747 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f31c0f-1354-4048-ad5e-961c12a71ca0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 08:53:55 crc kubenswrapper[4610]: I1006 08:53:55.782996 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvk5m"]
Oct 06 08:53:55 crc kubenswrapper[4610]: I1006 08:53:55.788053 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lvk5m"]
Oct 06 08:53:57 crc kubenswrapper[4610]: I1006 08:53:57.082718 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" path="/var/lib/kubelet/pods/17f31c0f-1354-4048-ad5e-961c12a71ca0/volumes"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.439799 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"]
Oct 06 08:53:59 crc kubenswrapper[4610]: E1006 08:53:59.440032 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="extract-content"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.440059 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="extract-content"
Oct 06 08:53:59 crc kubenswrapper[4610]: E1006 08:53:59.440072 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="registry-server"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.440081 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="registry-server"
Oct 06 08:53:59 crc kubenswrapper[4610]: E1006 08:53:59.440107 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="extract-utilities"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.440115 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="extract-utilities"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.440234 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f31c0f-1354-4048-ad5e-961c12a71ca0" containerName="registry-server"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.440916 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.442629 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lqhh6"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.467529 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-47k9v"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.468243 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.473608 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.501639 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.502526 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.509722 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.529528 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.594349 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.594970 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.597464 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.598368 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8gvpk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.600319 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610594 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-nmstate-lock\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610634 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmzpm\" (UniqueName: \"kubernetes.io/projected/bd9ce7eb-b1fc-4636-93fc-d007702a746f-kube-api-access-fmzpm\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610656 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ps7\" (UniqueName: \"kubernetes.io/projected/77dcdec2-c766-467b-a369-11ca28c22ae7-kube-api-access-28ps7\") pod \"nmstate-metrics-fdff9cb8d-4x98k\" (UID: \"77dcdec2-c766-467b-a369-11ca28c22ae7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610675 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/340ebace-99cf-4a2b-aaef-975b3480a795-kube-api-access-nmv9q\") pod \"nmstate-webhook-6cdbc54649-m8xhq\" (UID: \"340ebace-99cf-4a2b-aaef-975b3480a795\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610696 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-ovs-socket\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610736 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-dbus-socket\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.610768 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/340ebace-99cf-4a2b-aaef-975b3480a795-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m8xhq\" (UID: \"340ebace-99cf-4a2b-aaef-975b3480a795\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.618658 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712149 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/340ebace-99cf-4a2b-aaef-975b3480a795-kube-api-access-nmv9q\") pod \"nmstate-webhook-6cdbc54649-m8xhq\" (UID: \"340ebace-99cf-4a2b-aaef-975b3480a795\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712218 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-ovs-socket\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712284 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-dbus-socket\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712310 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/9590cbb8-dcf7-4c56-a984-028943b510d5-kube-api-access-cmj92\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712340 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9590cbb8-dcf7-4c56-a984-028943b510d5-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712365 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9590cbb8-dcf7-4c56-a984-028943b510d5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712389 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/340ebace-99cf-4a2b-aaef-975b3480a795-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m8xhq\" (UID: \"340ebace-99cf-4a2b-aaef-975b3480a795\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712401 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-ovs-socket\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712457 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-nmstate-lock\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712426 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-nmstate-lock\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712556 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmzpm\" (UniqueName: \"kubernetes.io/projected/bd9ce7eb-b1fc-4636-93fc-d007702a746f-kube-api-access-fmzpm\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712600 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ps7\" (UniqueName: \"kubernetes.io/projected/77dcdec2-c766-467b-a369-11ca28c22ae7-kube-api-access-28ps7\") pod \"nmstate-metrics-fdff9cb8d-4x98k\" (UID: \"77dcdec2-c766-467b-a369-11ca28c22ae7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.712692 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bd9ce7eb-b1fc-4636-93fc-d007702a746f-dbus-socket\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.721113 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/340ebace-99cf-4a2b-aaef-975b3480a795-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m8xhq\" (UID: \"340ebace-99cf-4a2b-aaef-975b3480a795\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.733329 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ps7\" (UniqueName: \"kubernetes.io/projected/77dcdec2-c766-467b-a369-11ca28c22ae7-kube-api-access-28ps7\") pod \"nmstate-metrics-fdff9cb8d-4x98k\" (UID: \"77dcdec2-c766-467b-a369-11ca28c22ae7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.748562 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmzpm\" (UniqueName: \"kubernetes.io/projected/bd9ce7eb-b1fc-4636-93fc-d007702a746f-kube-api-access-fmzpm\") pod \"nmstate-handler-47k9v\" (UID: \"bd9ce7eb-b1fc-4636-93fc-d007702a746f\") " pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.757856 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.760256 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/340ebace-99cf-4a2b-aaef-975b3480a795-kube-api-access-nmv9q\") pod \"nmstate-webhook-6cdbc54649-m8xhq\" (UID: \"340ebace-99cf-4a2b-aaef-975b3480a795\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.791637 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.803418 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7546cdc554-bzqrd"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.808148 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.813686 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/9590cbb8-dcf7-4c56-a984-028943b510d5-kube-api-access-cmj92\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.813733 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9590cbb8-dcf7-4c56-a984-028943b510d5-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.813763 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9590cbb8-dcf7-4c56-a984-028943b510d5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: E1006 08:53:59.814437 4610 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Oct 06 08:53:59 crc kubenswrapper[4610]: E1006 08:53:59.814493 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9590cbb8-dcf7-4c56-a984-028943b510d5-plugin-serving-cert podName:9590cbb8-dcf7-4c56-a984-028943b510d5 nodeName:}" failed. No retries permitted until 2025-10-06 08:54:00.31447648 +0000 UTC m=+772.029529868 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/9590cbb8-dcf7-4c56-a984-028943b510d5-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-tjvnk" (UID: "9590cbb8-dcf7-4c56-a984-028943b510d5") : secret "plugin-serving-cert" not found
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.815169 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9590cbb8-dcf7-4c56-a984-028943b510d5-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.826384 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.830557 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7546cdc554-bzqrd"]
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.852756 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/9590cbb8-dcf7-4c56-a984-028943b510d5-kube-api-access-cmj92\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920067 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6gn\" (UniqueName: \"kubernetes.io/projected/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-kube-api-access-pr6gn\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920114 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-serving-cert\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920159 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-config\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920212 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-service-ca\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920242 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-trusted-ca-bundle\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920276 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-oauth-serving-cert\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:53:59 crc kubenswrapper[4610]: I1006 08:53:59.920340 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-oauth-config\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.021002 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-oauth-serving-cert\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.021384 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-oauth-config\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.021471 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6gn\" (UniqueName: \"kubernetes.io/projected/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-kube-api-access-pr6gn\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.021495 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-serving-cert\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.022151 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-config\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.022570 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-trusted-ca-bundle\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.022602 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-service-ca\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.023739 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-service-ca\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.022086 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-oauth-serving-cert\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.024017 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-trusted-ca-bundle\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.024219 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-config\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.030639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-oauth-config\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.030639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-console-serving-cert\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.039950 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6gn\" (UniqueName: \"kubernetes.io/projected/ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d-kube-api-access-pr6gn\") pod \"console-7546cdc554-bzqrd\" (UID: \"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d\") " pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.188766 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7546cdc554-bzqrd"
Need to start a new one" pod="openshift-console/console-7546cdc554-bzqrd" Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.276237 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k"] Oct 06 08:54:00 crc kubenswrapper[4610]: W1006 08:54:00.289250 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77dcdec2_c766_467b_a369_11ca28c22ae7.slice/crio-aa36b3afb7c9fe5d3070344669037711dd97d65592352c5fd4e8c51e5ddbbce0 WatchSource:0}: Error finding container aa36b3afb7c9fe5d3070344669037711dd97d65592352c5fd4e8c51e5ddbbce0: Status 404 returned error can't find the container with id aa36b3afb7c9fe5d3070344669037711dd97d65592352c5fd4e8c51e5ddbbce0 Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.324951 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9590cbb8-dcf7-4c56-a984-028943b510d5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk" Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.328678 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9590cbb8-dcf7-4c56-a984-028943b510d5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-tjvnk\" (UID: \"9590cbb8-dcf7-4c56-a984-028943b510d5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk" Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.377987 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq"] Oct 06 08:54:00 crc kubenswrapper[4610]: W1006 08:54:00.384868 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340ebace_99cf_4a2b_aaef_975b3480a795.slice/crio-a0ee87dd870eefdd67ef8916e695884a4f79efac5ef630a7a65f754964b220b1 WatchSource:0}: Error finding container a0ee87dd870eefdd67ef8916e695884a4f79efac5ef630a7a65f754964b220b1: Status 404 returned error can't find the container with id a0ee87dd870eefdd67ef8916e695884a4f79efac5ef630a7a65f754964b220b1 Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.508889 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk" Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.583464 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k" event={"ID":"77dcdec2-c766-467b-a369-11ca28c22ae7","Type":"ContainerStarted","Data":"aa36b3afb7c9fe5d3070344669037711dd97d65592352c5fd4e8c51e5ddbbce0"} Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.608134 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-47k9v" event={"ID":"bd9ce7eb-b1fc-4636-93fc-d007702a746f","Type":"ContainerStarted","Data":"52051bfe3f50e57e7d400a82effb61d7cbfaa77b961144af71d98806e355df19"} Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.612861 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq" event={"ID":"340ebace-99cf-4a2b-aaef-975b3480a795","Type":"ContainerStarted","Data":"a0ee87dd870eefdd67ef8916e695884a4f79efac5ef630a7a65f754964b220b1"} Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.635669 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7546cdc554-bzqrd"] Oct 06 08:54:00 crc kubenswrapper[4610]: W1006 08:54:00.644092 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab067fe2_0be3_462e_bfc4_d9c6a4e40a0d.slice/crio-d0ba3309741b9d166e02c2b37a341c14658e900e90b36478f9d6051841874a9a WatchSource:0}: Error finding container d0ba3309741b9d166e02c2b37a341c14658e900e90b36478f9d6051841874a9a: Status 404 returned error can't find the container with id d0ba3309741b9d166e02c2b37a341c14658e900e90b36478f9d6051841874a9a Oct 06 08:54:00 crc kubenswrapper[4610]: I1006 08:54:00.985884 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk"] Oct 06 08:54:01 crc kubenswrapper[4610]: I1006 08:54:01.620827 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7546cdc554-bzqrd" event={"ID":"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d","Type":"ContainerStarted","Data":"e526c2f5c75c2b09defb1fb89f00f98ee9346aeab9e222f335da518c081b35ad"} Oct 06 08:54:01 crc kubenswrapper[4610]: I1006 08:54:01.621186 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7546cdc554-bzqrd" event={"ID":"ab067fe2-0be3-462e-bfc4-d9c6a4e40a0d","Type":"ContainerStarted","Data":"d0ba3309741b9d166e02c2b37a341c14658e900e90b36478f9d6051841874a9a"} Oct 06 08:54:01 crc kubenswrapper[4610]: I1006 08:54:01.622975 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk" event={"ID":"9590cbb8-dcf7-4c56-a984-028943b510d5","Type":"ContainerStarted","Data":"1210699df54177d8aa12b703d65cc47b2c8699d517323adadc851218f7cdc49d"} Oct 06 08:54:01 crc kubenswrapper[4610]: I1006 08:54:01.648229 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7546cdc554-bzqrd" podStartSLOduration=2.648212823 podStartE2EDuration="2.648212823s" podCreationTimestamp="2025-10-06 08:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:54:01.646935638 +0000 UTC m=+773.361989036" watchObservedRunningTime="2025-10-06 08:54:01.648212823 +0000 UTC m=+773.363266211" Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.652889 
4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq" event={"ID":"340ebace-99cf-4a2b-aaef-975b3480a795","Type":"ContainerStarted","Data":"d9806f81f6733ee1aabd1bd8308494d25cfcdd59a9f4b881ed3c41d66de3ecf2"} Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.654807 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk" event={"ID":"9590cbb8-dcf7-4c56-a984-028943b510d5","Type":"ContainerStarted","Data":"a4ead64fab1f44774b65b523274a84cf294bced74a676e327404ed72cb396db8"} Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.655551 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq" Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.659654 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k" event={"ID":"77dcdec2-c766-467b-a369-11ca28c22ae7","Type":"ContainerStarted","Data":"60b28c20676ba508cca74648a769a3caa6e32060d993a52e8cc474c2c81759b9"} Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.660832 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-47k9v" event={"ID":"bd9ce7eb-b1fc-4636-93fc-d007702a746f","Type":"ContainerStarted","Data":"6c7bda56618f143d20527c03300273595ce0b18536d50e97691836c4c02473d9"} Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.661540 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-47k9v" Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.671595 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq" podStartSLOduration=2.5678267569999997 podStartE2EDuration="5.671570612s" podCreationTimestamp="2025-10-06 08:53:59 +0000 UTC" firstStartedPulling="2025-10-06 08:54:00.389469287 +0000 UTC m=+772.104522675" lastFinishedPulling="2025-10-06 08:54:03.493213142 +0000 UTC m=+775.208266530" observedRunningTime="2025-10-06 08:54:04.669548146 +0000 UTC m=+776.384601564" watchObservedRunningTime="2025-10-06 08:54:04.671570612 +0000 UTC m=+776.386624040" Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.701089 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-tjvnk" podStartSLOduration=3.250267509 podStartE2EDuration="5.701071367s" podCreationTimestamp="2025-10-06 08:53:59 +0000 UTC" firstStartedPulling="2025-10-06 08:54:01.005924367 +0000 UTC m=+772.720977755" lastFinishedPulling="2025-10-06 08:54:03.456728215 +0000 UTC m=+775.171781613" observedRunningTime="2025-10-06 08:54:04.690892849 +0000 UTC m=+776.405946247" watchObservedRunningTime="2025-10-06 08:54:04.701071367 +0000 UTC m=+776.416124745" Oct 06 08:54:04 crc kubenswrapper[4610]: I1006 08:54:04.708882 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-47k9v" podStartSLOduration=2.136851144 podStartE2EDuration="5.7088634s" podCreationTimestamp="2025-10-06 08:53:59 +0000 UTC" firstStartedPulling="2025-10-06 08:53:59.883328941 +0000 UTC m=+771.598382329" lastFinishedPulling="2025-10-06 08:54:03.455341157 +0000 UTC m=+775.170394585" observedRunningTime="2025-10-06 08:54:04.708406238 +0000 UTC m=+776.423459716" watchObservedRunningTime="2025-10-06 08:54:04.7088634 +0000 UTC m=+776.423916798" Oct 06 08:54:06 crc kubenswrapper[4610]: I1006 
Oct 06 08:54:06 crc kubenswrapper[4610]: I1006 08:54:06.671732 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k" event={"ID":"77dcdec2-c766-467b-a369-11ca28c22ae7","Type":"ContainerStarted","Data":"5e1eca2b7079cb91a4893a3b13af2de7786ebcb389a005782a2522b5accd8c37"}
Oct 06 08:54:06 crc kubenswrapper[4610]: I1006 08:54:06.686666 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4x98k" podStartSLOduration=2.19019226 podStartE2EDuration="7.686644387s" podCreationTimestamp="2025-10-06 08:53:59 +0000 UTC" firstStartedPulling="2025-10-06 08:54:00.291747678 +0000 UTC m=+772.006801066" lastFinishedPulling="2025-10-06 08:54:05.788199805 +0000 UTC m=+777.503253193" observedRunningTime="2025-10-06 08:54:06.684964791 +0000 UTC m=+778.400018179" watchObservedRunningTime="2025-10-06 08:54:06.686644387 +0000 UTC m=+778.401697775"
Oct 06 08:54:09 crc kubenswrapper[4610]: I1006 08:54:09.828778 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-47k9v"
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.190524 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.190592 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.198364 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.703467 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7546cdc554-bzqrd"
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.776786 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8p28v"]
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.918484 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vb7"]
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.919751 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:10 crc kubenswrapper[4610]: I1006 08:54:10.941281 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vb7"]
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.096702 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-utilities\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.096786 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-catalog-content\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.096868 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzjn\" (UniqueName: \"kubernetes.io/projected/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-kube-api-access-5gzjn\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.197723 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-catalog-content\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.197810 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzjn\" (UniqueName: \"kubernetes.io/projected/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-kube-api-access-5gzjn\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.197843 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-utilities\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.198324 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-utilities\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.198533 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-catalog-content\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.217684 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzjn\" (UniqueName: \"kubernetes.io/projected/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-kube-api-access-5gzjn\") pod \"redhat-marketplace-x2vb7\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.236672 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.656366 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vb7"]
Oct 06 08:54:11 crc kubenswrapper[4610]: W1006 08:54:11.661610 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40ca96ad_d23d_4328_9ef8_d55d1e8c3732.slice/crio-0e95e14293365679caecd6ddb177ff9640363d415d1281ba92f98222ea57f749 WatchSource:0}: Error finding container 0e95e14293365679caecd6ddb177ff9640363d415d1281ba92f98222ea57f749: Status 404 returned error can't find the container with id 0e95e14293365679caecd6ddb177ff9640363d415d1281ba92f98222ea57f749
Oct 06 08:54:11 crc kubenswrapper[4610]: I1006 08:54:11.707168 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerStarted","Data":"0e95e14293365679caecd6ddb177ff9640363d415d1281ba92f98222ea57f749"}
Oct 06 08:54:12 crc kubenswrapper[4610]: I1006 08:54:12.717410 4610 generic.go:334] "Generic (PLEG): container finished" podID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerID="11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97" exitCode=0
Oct 06 08:54:12 crc kubenswrapper[4610]: I1006 08:54:12.718249 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerDied","Data":"11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97"}
Oct 06 08:54:13 crc kubenswrapper[4610]: I1006 08:54:13.726411 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerStarted","Data":"867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46"}
Oct 06 08:54:14 crc kubenswrapper[4610]: I1006 08:54:14.745398 4610 generic.go:334] "Generic (PLEG): container finished" podID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerID="867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46" exitCode=0
Oct 06 08:54:14 crc kubenswrapper[4610]: I1006 08:54:14.745480 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerDied","Data":"867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46"}
Oct 06 08:54:15 crc kubenswrapper[4610]: I1006 08:54:15.753285 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerStarted","Data":"379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519"}
Oct 06 08:54:15 crc kubenswrapper[4610]: I1006 08:54:15.793206 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2vb7" podStartSLOduration=3.318215311 podStartE2EDuration="5.793189983s" podCreationTimestamp="2025-10-06 08:54:10 +0000 UTC" firstStartedPulling="2025-10-06 08:54:12.721277861 +0000 UTC m=+784.436331279" lastFinishedPulling="2025-10-06 08:54:15.196252533 +0000 UTC m=+786.911305951" observedRunningTime="2025-10-06 08:54:15.791246771 +0000 UTC m=+787.506300179" watchObservedRunningTime="2025-10-06 08:54:15.793189983 +0000 UTC m=+787.508243371"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.295769 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m8jk7"]
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.298649 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8jk7"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.317497 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8jk7"]
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.468727 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.468789 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.468836 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.468859 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-catalog-content\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.468893 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-utilities\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.468949 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfc8f\" (UniqueName: \"kubernetes.io/projected/9a67b124-2c68-4873-9295-569878d903bb-kube-api-access-jfc8f\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7"
Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.469325 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38f8706bf8b9b80033ad9a39fb7a4758655b1b4513afef87a5de3c844e2a88e6"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
probe, will be restarted" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.469386 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://38f8706bf8b9b80033ad9a39fb7a4758655b1b4513afef87a5de3c844e2a88e6" gracePeriod=600 Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.570007 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfc8f\" (UniqueName: \"kubernetes.io/projected/9a67b124-2c68-4873-9295-569878d903bb-kube-api-access-jfc8f\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.570122 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-catalog-content\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.570156 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-utilities\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.570581 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-utilities\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.570826 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-catalog-content\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.592856 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfc8f\" (UniqueName: \"kubernetes.io/projected/9a67b124-2c68-4873-9295-569878d903bb-kube-api-access-jfc8f\") pod \"community-operators-m8jk7\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.628268 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.762746 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="38f8706bf8b9b80033ad9a39fb7a4758655b1b4513afef87a5de3c844e2a88e6" exitCode=0 Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.762776 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"38f8706bf8b9b80033ad9a39fb7a4758655b1b4513afef87a5de3c844e2a88e6"} Oct 06 08:54:16 crc kubenswrapper[4610]: I1006 08:54:16.762811 4610 scope.go:117] "RemoveContainer" containerID="8d99637b22ec27b39b6feca1514d926ac2e627b47656d8b3fb7c174e49b48aec" Oct 06 08:54:17 crc kubenswrapper[4610]: I1006 08:54:17.090659 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8jk7"] Oct 06 08:54:17 crc kubenswrapper[4610]: W1006 08:54:17.096743 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a67b124_2c68_4873_9295_569878d903bb.slice/crio-79fe7531b9e610f7d01116a6374d52adb7e7d401293208c15c820612473df1fc WatchSource:0}: Error finding container 79fe7531b9e610f7d01116a6374d52adb7e7d401293208c15c820612473df1fc: Status 404 returned error can't find the container with id 79fe7531b9e610f7d01116a6374d52adb7e7d401293208c15c820612473df1fc Oct 06 08:54:17 crc kubenswrapper[4610]: I1006 08:54:17.774225 4610 generic.go:334] "Generic (PLEG): container finished" podID="9a67b124-2c68-4873-9295-569878d903bb" containerID="2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280" exitCode=0 Oct 06 08:54:17 crc kubenswrapper[4610]: I1006 08:54:17.774355 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8jk7" event={"ID":"9a67b124-2c68-4873-9295-569878d903bb","Type":"ContainerDied","Data":"2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280"} Oct 06 08:54:17 crc kubenswrapper[4610]: I1006 08:54:17.774840 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8jk7" event={"ID":"9a67b124-2c68-4873-9295-569878d903bb","Type":"ContainerStarted","Data":"79fe7531b9e610f7d01116a6374d52adb7e7d401293208c15c820612473df1fc"} Oct 06 08:54:17 crc kubenswrapper[4610]: I1006 08:54:17.779542 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"7bdba77b46e82044baaa28f03a702e74591a001a85966cb8cf3dd9e4ff7e62b2"} Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.495072 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n6rl9"] Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.501439 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.523653 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6rl9"] Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.596372 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg2d\" (UniqueName: \"kubernetes.io/projected/7894405c-a800-4d87-957a-fe408e20c1f0-kube-api-access-rqg2d\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.596759 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-catalog-content\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.596863 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-utilities\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.697962 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-catalog-content\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.698021 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-utilities\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.698088 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg2d\" (UniqueName: \"kubernetes.io/projected/7894405c-a800-4d87-957a-fe408e20c1f0-kube-api-access-rqg2d\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.698410 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-catalog-content\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.698521 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-utilities\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.727588 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rqg2d\" (UniqueName: \"kubernetes.io/projected/7894405c-a800-4d87-957a-fe408e20c1f0-kube-api-access-rqg2d\") pod \"certified-operators-n6rl9\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:18 crc kubenswrapper[4610]: I1006 08:54:18.869035 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.233059 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6rl9"] Oct 06 08:54:19 crc kubenswrapper[4610]: W1006 08:54:19.241242 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7894405c_a800_4d87_957a_fe408e20c1f0.slice/crio-784390786fc739dffd34f7c9a34fa41d0dd911676a9356f0576e111482a13a1a WatchSource:0}: Error finding container 784390786fc739dffd34f7c9a34fa41d0dd911676a9356f0576e111482a13a1a: Status 404 returned error can't find the container with id 784390786fc739dffd34f7c9a34fa41d0dd911676a9356f0576e111482a13a1a Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.789603 4610 generic.go:334] "Generic (PLEG): container finished" podID="7894405c-a800-4d87-957a-fe408e20c1f0" containerID="311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406" exitCode=0 Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.789677 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6rl9" event={"ID":"7894405c-a800-4d87-957a-fe408e20c1f0","Type":"ContainerDied","Data":"311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406"} Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.789944 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6rl9" event={"ID":"7894405c-a800-4d87-957a-fe408e20c1f0","Type":"ContainerStarted","Data":"784390786fc739dffd34f7c9a34fa41d0dd911676a9356f0576e111482a13a1a"} Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.796162 4610 generic.go:334] "Generic (PLEG): container finished" podID="9a67b124-2c68-4873-9295-569878d903bb" containerID="21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc" exitCode=0 Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.796217 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8jk7" event={"ID":"9a67b124-2c68-4873-9295-569878d903bb","Type":"ContainerDied","Data":"21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc"} Oct 06 08:54:19 crc kubenswrapper[4610]: I1006 08:54:19.839834 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m8xhq" Oct 06 08:54:20 crc kubenswrapper[4610]: I1006 08:54:20.823228 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8jk7" event={"ID":"9a67b124-2c68-4873-9295-569878d903bb","Type":"ContainerStarted","Data":"8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192"} Oct 06 08:54:20 crc kubenswrapper[4610]: I1006 08:54:20.842306 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8jk7" podStartSLOduration=2.386979037 podStartE2EDuration="4.842285469s" podCreationTimestamp="2025-10-06 08:54:16 +0000 UTC" firstStartedPulling="2025-10-06 
Oct 06 08:54:20 crc kubenswrapper[4610]: I1006 08:54:20.842306 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8jk7" podStartSLOduration=2.386979037 podStartE2EDuration="4.842285469s" podCreationTimestamp="2025-10-06 08:54:16 +0000 UTC" firstStartedPulling="2025-10-06 08:54:17.777222308 +0000 UTC m=+789.492275736" lastFinishedPulling="2025-10-06 08:54:20.23252878 +0000 UTC m=+791.947582168" observedRunningTime="2025-10-06 08:54:20.840542112 +0000 UTC m=+792.555595510" watchObservedRunningTime="2025-10-06 08:54:20.842285469 +0000 UTC m=+792.557338857"
Oct 06 08:54:21 crc kubenswrapper[4610]: I1006 08:54:21.236740 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:21 crc kubenswrapper[4610]: I1006 08:54:21.236793 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:21 crc kubenswrapper[4610]: I1006 08:54:21.288393 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:21 crc kubenswrapper[4610]: I1006 08:54:21.831775 4610 generic.go:334] "Generic (PLEG): container finished" podID="7894405c-a800-4d87-957a-fe408e20c1f0" containerID="26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505" exitCode=0
Oct 06 08:54:21 crc kubenswrapper[4610]: I1006 08:54:21.831832 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6rl9" event={"ID":"7894405c-a800-4d87-957a-fe408e20c1f0","Type":"ContainerDied","Data":"26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505"}
Oct 06 08:54:21 crc kubenswrapper[4610]: I1006 08:54:21.889163 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2vb7"
Oct 06 08:54:23 crc kubenswrapper[4610]: I1006 08:54:23.844799 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6rl9" event={"ID":"7894405c-a800-4d87-957a-fe408e20c1f0","Type":"ContainerStarted","Data":"5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4"}
Oct 06 08:54:23 crc kubenswrapper[4610]: I1006 08:54:23.881632 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n6rl9" podStartSLOduration=2.9457227120000002 podStartE2EDuration="5.881612608s" podCreationTimestamp="2025-10-06 08:54:18 +0000 UTC" firstStartedPulling="2025-10-06 08:54:19.791764361 +0000 UTC m=+791.506817749" lastFinishedPulling="2025-10-06 08:54:22.727654267 +0000 UTC m=+794.442707645" observedRunningTime="2025-10-06 08:54:23.880783296 +0000 UTC m=+795.595836704" watchObservedRunningTime="2025-10-06 08:54:23.881612608 +0000 UTC m=+795.596665996"
Oct 06 08:54:24 crc kubenswrapper[4610]: I1006 08:54:24.686402 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vb7"]
Oct 06 08:54:24 crc kubenswrapper[4610]: I1006 08:54:24.686792 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2vb7" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="registry-server" containerID="cri-o://379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519" gracePeriod=2
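The pod_startup_latency_tracker entries above are internally consistent: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For community-operators-m8jk7 that is 4.842285469s - (20.23252878s - 17.777222308s) ≈ 2.386979s, matching the logged podStartSLOduration=2.386979037; the certified-operators-n6rl9 entry checks out the same way. A small Go verification of the arithmetic (a back-of-envelope check, not the tracker's actual code):

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	firstStartedPulling := parse("2025-10-06 08:54:17.777222308 +0000 UTC")
	lastFinishedPulling := parse("2025-10-06 08:54:20.23252878 +0000 UTC")
	e2e := 4842285469 * time.Nanosecond // podStartE2EDuration="4.842285469s"
	slo := e2e - lastFinishedPulling.Sub(firstStartedPulling)
	// Prints ~2.386979s; the last digits differ slightly from the logged
	// 2.386979037 only because the log prints truncated timestamps.
	fmt.Println(slo)
}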
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vb7" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.611308 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gzjn\" (UniqueName: \"kubernetes.io/projected/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-kube-api-access-5gzjn\") pod \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.611350 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-catalog-content\") pod \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.611384 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-utilities\") pod \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\" (UID: \"40ca96ad-d23d-4328-9ef8-d55d1e8c3732\") " Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.612198 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-utilities" (OuterVolumeSpecName: "utilities") pod "40ca96ad-d23d-4328-9ef8-d55d1e8c3732" (UID: "40ca96ad-d23d-4328-9ef8-d55d1e8c3732"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.626292 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-kube-api-access-5gzjn" (OuterVolumeSpecName: "kube-api-access-5gzjn") pod "40ca96ad-d23d-4328-9ef8-d55d1e8c3732" (UID: "40ca96ad-d23d-4328-9ef8-d55d1e8c3732"). InnerVolumeSpecName "kube-api-access-5gzjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.632101 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40ca96ad-d23d-4328-9ef8-d55d1e8c3732" (UID: "40ca96ad-d23d-4328-9ef8-d55d1e8c3732"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.712430 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.712469 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gzjn\" (UniqueName: \"kubernetes.io/projected/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-kube-api-access-5gzjn\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.712484 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ca96ad-d23d-4328-9ef8-d55d1e8c3732-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.855477 4610 generic.go:334] "Generic (PLEG): container finished" podID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerID="379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519" exitCode=0 Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.855518 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerDied","Data":"379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519"} Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.855545 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vb7" event={"ID":"40ca96ad-d23d-4328-9ef8-d55d1e8c3732","Type":"ContainerDied","Data":"0e95e14293365679caecd6ddb177ff9640363d415d1281ba92f98222ea57f749"} Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.855561 4610 scope.go:117] "RemoveContainer" containerID="379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.855559 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vb7" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.888714 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vb7"] Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.889077 4610 scope.go:117] "RemoveContainer" containerID="867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.893897 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vb7"] Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.903382 4610 scope.go:117] "RemoveContainer" containerID="11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.923169 4610 scope.go:117] "RemoveContainer" containerID="379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519" Oct 06 08:54:25 crc kubenswrapper[4610]: E1006 08:54:25.923681 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519\": container with ID starting with 379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519 not found: ID does not exist" containerID="379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.923735 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519"} err="failed to get container status \"379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519\": rpc error: code = NotFound desc = could not find container \"379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519\": container with ID starting with 379945b3ef47d3d842636c7fa98669cc12bc675a220a6700493f9e79b0244519 not found: ID does not exist" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.923775 4610 scope.go:117] "RemoveContainer" containerID="867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46" Oct 06 08:54:25 crc kubenswrapper[4610]: E1006 08:54:25.924315 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46\": container with ID starting with 867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46 not found: ID does not exist" containerID="867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.924343 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46"} err="failed to get container status \"867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46\": rpc error: code = NotFound desc = could not find container \"867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46\": container with ID starting with 867b69be41668d194de6ec6e90638b7aae3c13befc1d09aa27b3934db30b0a46 not found: ID does not exist" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.924490 4610 scope.go:117] "RemoveContainer" containerID="11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97" Oct 06 08:54:25 crc kubenswrapper[4610]: E1006 08:54:25.924973 4610 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97\": container with ID starting with 11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97 not found: ID does not exist" containerID="11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97" Oct 06 08:54:25 crc kubenswrapper[4610]: I1006 08:54:25.925023 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97"} err="failed to get container status \"11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97\": rpc error: code = NotFound desc = could not find container \"11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97\": container with ID starting with 11a475c2e9780b2795f63f564d068972995c7df0efd240cbd4c0b82aebba2e97 not found: ID does not exist" Oct 06 08:54:26 crc kubenswrapper[4610]: I1006 08:54:26.628527 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:26 crc kubenswrapper[4610]: I1006 08:54:26.628893 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:26 crc kubenswrapper[4610]: I1006 08:54:26.675166 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:26 crc kubenswrapper[4610]: I1006 08:54:26.932353 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:27 crc kubenswrapper[4610]: I1006 08:54:27.081149 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" path="/var/lib/kubelet/pods/40ca96ad-d23d-4328-9ef8-d55d1e8c3732/volumes" Oct 06 08:54:28 crc kubenswrapper[4610]: I1006 08:54:28.870640 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:28 crc kubenswrapper[4610]: I1006 08:54:28.871202 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:28 crc kubenswrapper[4610]: I1006 08:54:28.893723 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8jk7"] Oct 06 08:54:28 crc kubenswrapper[4610]: I1006 08:54:28.893962 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m8jk7" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="registry-server" containerID="cri-o://8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192" gracePeriod=2 Oct 06 08:54:28 crc kubenswrapper[4610]: I1006 08:54:28.935231 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:28 crc kubenswrapper[4610]: I1006 08:54:28.984942 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.257149 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.276308 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-catalog-content\") pod \"9a67b124-2c68-4873-9295-569878d903bb\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.276428 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-utilities\") pod \"9a67b124-2c68-4873-9295-569878d903bb\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.276718 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfc8f\" (UniqueName: \"kubernetes.io/projected/9a67b124-2c68-4873-9295-569878d903bb-kube-api-access-jfc8f\") pod \"9a67b124-2c68-4873-9295-569878d903bb\" (UID: \"9a67b124-2c68-4873-9295-569878d903bb\") " Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.278942 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-utilities" (OuterVolumeSpecName: "utilities") pod "9a67b124-2c68-4873-9295-569878d903bb" (UID: "9a67b124-2c68-4873-9295-569878d903bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.310094 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a67b124-2c68-4873-9295-569878d903bb-kube-api-access-jfc8f" (OuterVolumeSpecName: "kube-api-access-jfc8f") pod "9a67b124-2c68-4873-9295-569878d903bb" (UID: "9a67b124-2c68-4873-9295-569878d903bb"). InnerVolumeSpecName "kube-api-access-jfc8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.357984 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a67b124-2c68-4873-9295-569878d903bb" (UID: "9a67b124-2c68-4873-9295-569878d903bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.379181 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.379208 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a67b124-2c68-4873-9295-569878d903bb-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.379225 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfc8f\" (UniqueName: \"kubernetes.io/projected/9a67b124-2c68-4873-9295-569878d903bb-kube-api-access-jfc8f\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.901408 4610 generic.go:334] "Generic (PLEG): container finished" podID="9a67b124-2c68-4873-9295-569878d903bb" containerID="8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192" exitCode=0 Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.901508 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8jk7" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.901548 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8jk7" event={"ID":"9a67b124-2c68-4873-9295-569878d903bb","Type":"ContainerDied","Data":"8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192"} Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.901612 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8jk7" event={"ID":"9a67b124-2c68-4873-9295-569878d903bb","Type":"ContainerDied","Data":"79fe7531b9e610f7d01116a6374d52adb7e7d401293208c15c820612473df1fc"} Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.901945 4610 scope.go:117] "RemoveContainer" containerID="8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.931791 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8jk7"] Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.932248 4610 scope.go:117] "RemoveContainer" containerID="21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.936074 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m8jk7"] Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.949362 4610 scope.go:117] "RemoveContainer" containerID="2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.971637 4610 scope.go:117] "RemoveContainer" containerID="8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192" Oct 06 08:54:29 crc kubenswrapper[4610]: E1006 08:54:29.972252 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192\": container with ID starting with 8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192 not found: ID does not exist" containerID="8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.972289 
4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192"} err="failed to get container status \"8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192\": rpc error: code = NotFound desc = could not find container \"8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192\": container with ID starting with 8c778fe4aa381b525e76b7791759124485291165d05b7dc30cb66060cd897192 not found: ID does not exist" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.972314 4610 scope.go:117] "RemoveContainer" containerID="21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc" Oct 06 08:54:29 crc kubenswrapper[4610]: E1006 08:54:29.972641 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc\": container with ID starting with 21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc not found: ID does not exist" containerID="21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.972668 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc"} err="failed to get container status \"21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc\": rpc error: code = NotFound desc = could not find container \"21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc\": container with ID starting with 21ebd0d7b1d661ecce399e30d01c0d4304429ef3afbafd881ec358717431b0cc not found: ID does not exist" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.972690 4610 scope.go:117] "RemoveContainer" containerID="2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280" Oct 06 08:54:29 crc kubenswrapper[4610]: E1006 08:54:29.972936 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280\": container with ID starting with 2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280 not found: ID does not exist" containerID="2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280" Oct 06 08:54:29 crc kubenswrapper[4610]: I1006 08:54:29.972969 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280"} err="failed to get container status \"2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280\": rpc error: code = NotFound desc = could not find container \"2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280\": container with ID starting with 2210f2467b6d64e54e0fd1a9baa59a13297267f0e7cc24d34dc23e409f76a280 not found: ID does not exist" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.081027 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a67b124-2c68-4873-9295-569878d903bb" path="/var/lib/kubelet/pods/9a67b124-2c68-4873-9295-569878d903bb/volumes" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.284884 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6rl9"] Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.285150 4610 kuberuntime_container.go:808] "Killing container with a 
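The kubelet_volumes.go "Cleaned up orphaned pod volumes dir" entries above are the periodic housekeeping pass that reclaims /var/lib/kubelet/pods/<UID> directories once the pod object is gone and its volumes are unmounted. A rough, non-destructive sketch of that scan; the helper name is hypothetical and it prints instead of deleting:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedPodDirs lists per-pod directories under the kubelet root and
// flags those whose UID no longer matches any active pod.
func cleanupOrphanedPodDirs(root string, active map[string]bool) error {
	entries, err := os.ReadDir(root)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() && !active[e.Name()] {
			// The real kubelet also verifies no volumes remain mounted
			// before removing the directory tree.
			fmt.Println("would clean up orphaned pod volumes dir:",
				filepath.Join(root, e.Name(), "volumes"))
		}
	}
	return nil
}

func main() {
	_ = cleanupOrphanedPodDirs("/var/lib/kubelet/pods", map[string]bool{
		"9e1e51cb-d7f6-4b8b-8c1c-46c166179994": true, // still running
	})
}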
grace period" pod="openshift-marketplace/certified-operators-n6rl9" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="registry-server" containerID="cri-o://5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4" gracePeriod=2 Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.692831 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.708960 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-utilities\") pod \"7894405c-a800-4d87-957a-fe408e20c1f0\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.709007 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqg2d\" (UniqueName: \"kubernetes.io/projected/7894405c-a800-4d87-957a-fe408e20c1f0-kube-api-access-rqg2d\") pod \"7894405c-a800-4d87-957a-fe408e20c1f0\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.709127 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-catalog-content\") pod \"7894405c-a800-4d87-957a-fe408e20c1f0\" (UID: \"7894405c-a800-4d87-957a-fe408e20c1f0\") " Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.711907 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-utilities" (OuterVolumeSpecName: "utilities") pod "7894405c-a800-4d87-957a-fe408e20c1f0" (UID: "7894405c-a800-4d87-957a-fe408e20c1f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.718499 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7894405c-a800-4d87-957a-fe408e20c1f0-kube-api-access-rqg2d" (OuterVolumeSpecName: "kube-api-access-rqg2d") pod "7894405c-a800-4d87-957a-fe408e20c1f0" (UID: "7894405c-a800-4d87-957a-fe408e20c1f0"). InnerVolumeSpecName "kube-api-access-rqg2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.766900 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7894405c-a800-4d87-957a-fe408e20c1f0" (UID: "7894405c-a800-4d87-957a-fe408e20c1f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.810179 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.810213 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7894405c-a800-4d87-957a-fe408e20c1f0-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.810223 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqg2d\" (UniqueName: \"kubernetes.io/projected/7894405c-a800-4d87-957a-fe408e20c1f0-kube-api-access-rqg2d\") on node \"crc\" DevicePath \"\"" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.915168 4610 generic.go:334] "Generic (PLEG): container finished" podID="7894405c-a800-4d87-957a-fe408e20c1f0" containerID="5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4" exitCode=0 Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.915252 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6rl9" event={"ID":"7894405c-a800-4d87-957a-fe408e20c1f0","Type":"ContainerDied","Data":"5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4"} Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.915578 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6rl9" event={"ID":"7894405c-a800-4d87-957a-fe408e20c1f0","Type":"ContainerDied","Data":"784390786fc739dffd34f7c9a34fa41d0dd911676a9356f0576e111482a13a1a"} Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.915609 4610 scope.go:117] "RemoveContainer" containerID="5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.915296 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6rl9" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.933910 4610 scope.go:117] "RemoveContainer" containerID="26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.943396 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6rl9"] Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.948505 4610 scope.go:117] "RemoveContainer" containerID="311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.951992 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n6rl9"] Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.963695 4610 scope.go:117] "RemoveContainer" containerID="5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4" Oct 06 08:54:31 crc kubenswrapper[4610]: E1006 08:54:31.964234 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4\": container with ID starting with 5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4 not found: ID does not exist" containerID="5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.964280 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4"} err="failed to get container status \"5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4\": rpc error: code = NotFound desc = could not find container \"5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4\": container with ID starting with 5d37a8f6b9bd133fcb02003f3135c996aaa9b8803efc204d2a149204d40aa6e4 not found: ID does not exist" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.964303 4610 scope.go:117] "RemoveContainer" containerID="26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505" Oct 06 08:54:31 crc kubenswrapper[4610]: E1006 08:54:31.964627 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505\": container with ID starting with 26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505 not found: ID does not exist" containerID="26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.964671 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505"} err="failed to get container status \"26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505\": rpc error: code = NotFound desc = could not find container \"26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505\": container with ID starting with 26c1f9b4f60c87025131711fd8f3679e7457e6b9e0deb0e066fb757cc0979505 not found: ID does not exist" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.964705 4610 scope.go:117] "RemoveContainer" containerID="311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406" Oct 06 08:54:31 crc kubenswrapper[4610]: E1006 08:54:31.964988 4610 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406\": container with ID starting with 311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406 not found: ID does not exist" containerID="311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406" Oct 06 08:54:31 crc kubenswrapper[4610]: I1006 08:54:31.965024 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406"} err="failed to get container status \"311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406\": rpc error: code = NotFound desc = could not find container \"311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406\": container with ID starting with 311c06c2ed381557c30c941529482e1f5f5bc68fdc03e5dadc3b219bd461b406 not found: ID does not exist" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.077271 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" path="/var/lib/kubelet/pods/7894405c-a800-4d87-957a-fe408e20c1f0/volumes" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.558684 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk"] Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559115 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="extract-content" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559147 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="extract-content" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559173 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="extract-utilities" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559186 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="extract-utilities" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559210 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="extract-utilities" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559224 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="extract-utilities" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559240 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559251 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559264 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="extract-content" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559275 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="extract-content" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559293 4610 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="extract-utilities" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559306 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="extract-utilities" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559326 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559338 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559356 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="extract-content" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559369 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="extract-content" Oct 06 08:54:33 crc kubenswrapper[4610]: E1006 08:54:33.559393 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559409 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559610 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7894405c-a800-4d87-957a-fe408e20c1f0" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559638 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ca96ad-d23d-4328-9ef8-d55d1e8c3732" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.559657 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a67b124-2c68-4873-9295-569878d903bb" containerName="registry-server" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.561178 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.563844 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.586128 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk"] Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.632485 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.632564 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.632603 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qgm\" (UniqueName: \"kubernetes.io/projected/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-kube-api-access-p8qgm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.734521 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.734603 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.734629 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qgm\" (UniqueName: \"kubernetes.io/projected/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-kube-api-access-p8qgm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.735101 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.735333 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.755756 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qgm\" (UniqueName: \"kubernetes.io/projected/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-kube-api-access-p8qgm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:33 crc kubenswrapper[4610]: I1006 08:54:33.889988 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" Oct 06 08:54:34 crc kubenswrapper[4610]: I1006 08:54:34.290565 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk"] Oct 06 08:54:34 crc kubenswrapper[4610]: W1006 08:54:34.300184 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e1e51cb_d7f6_4b8b_8c1c_46c166179994.slice/crio-d50137ae388dcf671e499e2275f98c914a85d88abc66000de1fe7e0b1f2d02a2 WatchSource:0}: Error finding container d50137ae388dcf671e499e2275f98c914a85d88abc66000de1fe7e0b1f2d02a2: Status 404 returned error can't find the container with id d50137ae388dcf671e499e2275f98c914a85d88abc66000de1fe7e0b1f2d02a2 Oct 06 08:54:34 crc kubenswrapper[4610]: I1006 08:54:34.949999 4610 generic.go:334] "Generic (PLEG): container finished" podID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerID="615eda7132f4098f5dc9d3b66ef7c26b1b52f8289ef68b6ae04ebf710835845d" exitCode=0 Oct 06 08:54:34 crc kubenswrapper[4610]: I1006 08:54:34.950051 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" event={"ID":"9e1e51cb-d7f6-4b8b-8c1c-46c166179994","Type":"ContainerDied","Data":"615eda7132f4098f5dc9d3b66ef7c26b1b52f8289ef68b6ae04ebf710835845d"} Oct 06 08:54:34 crc kubenswrapper[4610]: I1006 08:54:34.950333 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" event={"ID":"9e1e51cb-d7f6-4b8b-8c1c-46c166179994","Type":"ContainerStarted","Data":"d50137ae388dcf671e499e2275f98c914a85d88abc66000de1fe7e0b1f2d02a2"} Oct 06 08:54:35 crc kubenswrapper[4610]: I1006 08:54:35.837827 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8p28v" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console" containerID="cri-o://f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f" gracePeriod=15 Oct 06 08:54:36 crc 
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.193834 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8p28v_75726254-6806-4c39-a565-f48ca0eb4fd3/console/0.log"
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.194158 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.265322 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-service-ca\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.265351 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-oauth-serving-cert\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.265368 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-trusted-ca-bundle\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.265396 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-oauth-config\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.265944 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-console-config\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.265998 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/75726254-6806-4c39-a565-f48ca0eb4fd3-kube-api-access-jpwrq\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.266121 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-serving-cert\") pod \"75726254-6806-4c39-a565-f48ca0eb4fd3\" (UID: \"75726254-6806-4c39-a565-f48ca0eb4fd3\") "
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.266839 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.267143 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-service-ca" (OuterVolumeSpecName: "service-ca") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.267259 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.268109 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-console-config" (OuterVolumeSpecName: "console-config") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.271354 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.271518 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75726254-6806-4c39-a565-f48ca0eb4fd3-kube-api-access-jpwrq" (OuterVolumeSpecName: "kube-api-access-jpwrq") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "kube-api-access-jpwrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.272338 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "75726254-6806-4c39-a565-f48ca0eb4fd3" (UID: "75726254-6806-4c39-a565-f48ca0eb4fd3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368129 4610 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-console-config\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368165 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpwrq\" (UniqueName: \"kubernetes.io/projected/75726254-6806-4c39-a565-f48ca0eb4fd3-kube-api-access-jpwrq\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368180 4610 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368193 4610 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-service-ca\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368204 4610 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368214 4610 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75726254-6806-4c39-a565-f48ca0eb4fd3-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.368225 4610 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75726254-6806-4c39-a565-f48ca0eb4fd3-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.976768 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8p28v_75726254-6806-4c39-a565-f48ca0eb4fd3/console/0.log"
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.977037 4610 generic.go:334] "Generic (PLEG): container finished" podID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerID="f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f" exitCode=2
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.977175 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8p28v"
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.977265 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8p28v" event={"ID":"75726254-6806-4c39-a565-f48ca0eb4fd3","Type":"ContainerDied","Data":"f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f"}
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.977300 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8p28v" event={"ID":"75726254-6806-4c39-a565-f48ca0eb4fd3","Type":"ContainerDied","Data":"39906fcb5c32e7d40b57b13cefc2c99d076496deb67a168b541e6760db06a235"}
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.977321 4610 scope.go:117] "RemoveContainer" containerID="f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f"
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.986585 4610 generic.go:334] "Generic (PLEG): container finished" podID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerID="56c49ca90a9d211c3fbc4f42b742e42933fc0ed6305b30f0bbe69c04b48a1500" exitCode=0
Oct 06 08:54:36 crc kubenswrapper[4610]: I1006 08:54:36.986647 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" event={"ID":"9e1e51cb-d7f6-4b8b-8c1c-46c166179994","Type":"ContainerDied","Data":"56c49ca90a9d211c3fbc4f42b742e42933fc0ed6305b30f0bbe69c04b48a1500"}
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.006112 4610 scope.go:117] "RemoveContainer" containerID="f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f"
Oct 06 08:54:37 crc kubenswrapper[4610]: E1006 08:54:37.007797 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f\": container with ID starting with f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f not found: ID does not exist" containerID="f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f"
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.007892 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f"} err="failed to get container status \"f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f\": rpc error: code = NotFound desc = could not find container \"f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f\": container with ID starting with f9aa7021661b5a2a02d0cdd228611e45e137e033a4ced7a4ce1be652ddb7d63f not found: ID does not exist"
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.041035 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8p28v"]
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.041117 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8p28v"]
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.078301 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" path="/var/lib/kubelet/pods/75726254-6806-4c39-a565-f48ca0eb4fd3/volumes"
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.998857 4610 generic.go:334] "Generic (PLEG): container finished" podID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerID="5490a1c064ffeed9df34aa98c79f12ca04ec54ef446fa280e690eea30281f7a3" exitCode=0
Oct 06 08:54:37 crc kubenswrapper[4610]: I1006 08:54:37.998929 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" event={"ID":"9e1e51cb-d7f6-4b8b-8c1c-46c166179994","Type":"ContainerDied","Data":"5490a1c064ffeed9df34aa98c79f12ca04ec54ef446fa280e690eea30281f7a3"}
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.315009 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk"
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.504974 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-util\") pod \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") "
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.505035 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8qgm\" (UniqueName: \"kubernetes.io/projected/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-kube-api-access-p8qgm\") pod \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") "
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.505121 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-bundle\") pod \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\" (UID: \"9e1e51cb-d7f6-4b8b-8c1c-46c166179994\") "
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.506323 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-bundle" (OuterVolumeSpecName: "bundle") pod "9e1e51cb-d7f6-4b8b-8c1c-46c166179994" (UID: "9e1e51cb-d7f6-4b8b-8c1c-46c166179994"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.510362 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-kube-api-access-p8qgm" (OuterVolumeSpecName: "kube-api-access-p8qgm") pod "9e1e51cb-d7f6-4b8b-8c1c-46c166179994" (UID: "9e1e51cb-d7f6-4b8b-8c1c-46c166179994"). InnerVolumeSpecName "kube-api-access-p8qgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.520357 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-util" (OuterVolumeSpecName: "util") pod "9e1e51cb-d7f6-4b8b-8c1c-46c166179994" (UID: "9e1e51cb-d7f6-4b8b-8c1c-46c166179994"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.606905 4610 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-util\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.606944 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8qgm\" (UniqueName: \"kubernetes.io/projected/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-kube-api-access-p8qgm\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:39 crc kubenswrapper[4610]: I1006 08:54:39.606958 4610 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e1e51cb-d7f6-4b8b-8c1c-46c166179994-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:54:40 crc kubenswrapper[4610]: I1006 08:54:40.014282 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk" event={"ID":"9e1e51cb-d7f6-4b8b-8c1c-46c166179994","Type":"ContainerDied","Data":"d50137ae388dcf671e499e2275f98c914a85d88abc66000de1fe7e0b1f2d02a2"}
Oct 06 08:54:40 crc kubenswrapper[4610]: I1006 08:54:40.014332 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50137ae388dcf671e499e2275f98c914a85d88abc66000de1fe7e0b1f2d02a2"
Oct 06 08:54:40 crc kubenswrapper[4610]: I1006 08:54:40.014355 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk"
Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.563331 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv"]
Oct 06 08:54:47 crc kubenswrapper[4610]: E1006 08:54:47.564136 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="util"
Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564152 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="util"
Oct 06 08:54:47 crc kubenswrapper[4610]: E1006 08:54:47.564173 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="extract"
Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564180 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="extract"
Oct 06 08:54:47 crc kubenswrapper[4610]: E1006 08:54:47.564191 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="pull"
Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564198 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="pull"
Oct 06 08:54:47 crc kubenswrapper[4610]: E1006 08:54:47.564207 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console"
Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564213 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console"
Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564344 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="75726254-6806-4c39-a565-f48ca0eb4fd3" containerName="console" Oct
06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564355 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e1e51cb-d7f6-4b8b-8c1c-46c166179994" containerName="extract" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.564820 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.567512 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.567539 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.567661 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.568012 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.568212 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ppjnr" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.597868 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv"] Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.602626 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4864fd4e-baeb-4b35-ae9b-b41f43515efd-apiservice-cert\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.602680 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4864fd4e-baeb-4b35-ae9b-b41f43515efd-webhook-cert\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.602704 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbkt\" (UniqueName: \"kubernetes.io/projected/4864fd4e-baeb-4b35-ae9b-b41f43515efd-kube-api-access-fzbkt\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.703975 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4864fd4e-baeb-4b35-ae9b-b41f43515efd-webhook-cert\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.704028 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbkt\" (UniqueName: 
\"kubernetes.io/projected/4864fd4e-baeb-4b35-ae9b-b41f43515efd-kube-api-access-fzbkt\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.704131 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4864fd4e-baeb-4b35-ae9b-b41f43515efd-apiservice-cert\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.710853 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4864fd4e-baeb-4b35-ae9b-b41f43515efd-webhook-cert\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.714340 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4864fd4e-baeb-4b35-ae9b-b41f43515efd-apiservice-cert\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.735067 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbkt\" (UniqueName: \"kubernetes.io/projected/4864fd4e-baeb-4b35-ae9b-b41f43515efd-kube-api-access-fzbkt\") pod \"metallb-operator-controller-manager-78b8b54fdd-fwfzv\" (UID: \"4864fd4e-baeb-4b35-ae9b-b41f43515efd\") " pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.825262 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg"] Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.825912 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.827920 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.829724 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.832423 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xbx9j" Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.842529 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg"] Oct 06 08:54:47 crc kubenswrapper[4610]: I1006 08:54:47.882517 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.007509 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-apiservice-cert\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.007572 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-webhook-cert\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.007644 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swdk8\" (UniqueName: \"kubernetes.io/projected/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-kube-api-access-swdk8\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.109814 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swdk8\" (UniqueName: \"kubernetes.io/projected/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-kube-api-access-swdk8\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.110261 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-apiservice-cert\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.110281 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-webhook-cert\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.154895 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swdk8\" (UniqueName: \"kubernetes.io/projected/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-kube-api-access-swdk8\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.156758 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-apiservice-cert\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: 
\"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.163637 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2f355c4-bea3-46ef-b5bf-d7393c884ac1-webhook-cert\") pod \"metallb-operator-webhook-server-6b496585dd-ndrsg\" (UID: \"f2f355c4-bea3-46ef-b5bf-d7393c884ac1\") " pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.203101 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv"] Oct 06 08:54:48 crc kubenswrapper[4610]: W1006 08:54:48.216204 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4864fd4e_baeb_4b35_ae9b_b41f43515efd.slice/crio-e2cd988ca56d9eb4bed575edea581de70dba09f4b68383e53179cc8081b3cf64 WatchSource:0}: Error finding container e2cd988ca56d9eb4bed575edea581de70dba09f4b68383e53179cc8081b3cf64: Status 404 returned error can't find the container with id e2cd988ca56d9eb4bed575edea581de70dba09f4b68383e53179cc8081b3cf64 Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.439369 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:48 crc kubenswrapper[4610]: I1006 08:54:48.709866 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg"] Oct 06 08:54:48 crc kubenswrapper[4610]: W1006 08:54:48.717207 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2f355c4_bea3_46ef_b5bf_d7393c884ac1.slice/crio-8257328a9a9557a4a47afebf99cd103c19b57c5ab427854dc61d803a3f1cb7c9 WatchSource:0}: Error finding container 8257328a9a9557a4a47afebf99cd103c19b57c5ab427854dc61d803a3f1cb7c9: Status 404 returned error can't find the container with id 8257328a9a9557a4a47afebf99cd103c19b57c5ab427854dc61d803a3f1cb7c9 Oct 06 08:54:49 crc kubenswrapper[4610]: I1006 08:54:49.079809 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" event={"ID":"f2f355c4-bea3-46ef-b5bf-d7393c884ac1","Type":"ContainerStarted","Data":"8257328a9a9557a4a47afebf99cd103c19b57c5ab427854dc61d803a3f1cb7c9"} Oct 06 08:54:49 crc kubenswrapper[4610]: I1006 08:54:49.079845 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" event={"ID":"4864fd4e-baeb-4b35-ae9b-b41f43515efd","Type":"ContainerStarted","Data":"e2cd988ca56d9eb4bed575edea581de70dba09f4b68383e53179cc8081b3cf64"} Oct 06 08:54:52 crc kubenswrapper[4610]: I1006 08:54:52.101380 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" event={"ID":"4864fd4e-baeb-4b35-ae9b-b41f43515efd","Type":"ContainerStarted","Data":"8dc0d3e947ef8abf80f615ad860a8722c292274c20199e7ae1d4ede4528fe293"} Oct 06 08:54:52 crc kubenswrapper[4610]: I1006 08:54:52.101771 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:54:52 crc kubenswrapper[4610]: I1006 08:54:52.145693 4610 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" podStartSLOduration=1.505041957 podStartE2EDuration="5.145672932s" podCreationTimestamp="2025-10-06 08:54:47 +0000 UTC" firstStartedPulling="2025-10-06 08:54:48.217536163 +0000 UTC m=+819.932589551" lastFinishedPulling="2025-10-06 08:54:51.858167138 +0000 UTC m=+823.573220526" observedRunningTime="2025-10-06 08:54:52.141275356 +0000 UTC m=+823.856328744" watchObservedRunningTime="2025-10-06 08:54:52.145672932 +0000 UTC m=+823.860726330" Oct 06 08:54:55 crc kubenswrapper[4610]: I1006 08:54:55.123117 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" event={"ID":"f2f355c4-bea3-46ef-b5bf-d7393c884ac1","Type":"ContainerStarted","Data":"d03a0b806662833517dfc40b031f706ee1404e3914062788fd8597e36fc6e668"} Oct 06 08:54:55 crc kubenswrapper[4610]: I1006 08:54:55.123718 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:54:55 crc kubenswrapper[4610]: I1006 08:54:55.142198 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" podStartSLOduration=2.792389516 podStartE2EDuration="8.142182275s" podCreationTimestamp="2025-10-06 08:54:47 +0000 UTC" firstStartedPulling="2025-10-06 08:54:48.718529179 +0000 UTC m=+820.433582577" lastFinishedPulling="2025-10-06 08:54:54.068321948 +0000 UTC m=+825.783375336" observedRunningTime="2025-10-06 08:54:55.139309149 +0000 UTC m=+826.854362537" watchObservedRunningTime="2025-10-06 08:54:55.142182275 +0000 UTC m=+826.857235673" Oct 06 08:55:08 crc kubenswrapper[4610]: I1006 08:55:08.447543 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6b496585dd-ndrsg" Oct 06 08:55:27 crc kubenswrapper[4610]: I1006 08:55:27.884962 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78b8b54fdd-fwfzv" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.684257 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd"] Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.685207 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.687589 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jpx64" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.690092 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rmk9w"] Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.698261 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.702048 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.702279 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.710241 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.745768 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd"] Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.809839 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f88e64c-929a-4a97-a3a1-a92face17060-metrics-certs\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.809899 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq9hv\" (UniqueName: \"kubernetes.io/projected/e0af20a6-573f-4421-b9a4-5d5005a855b8-kube-api-access-pq9hv\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.809931 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vwf\" (UniqueName: \"kubernetes.io/projected/0f88e64c-929a-4a97-a3a1-a92face17060-kube-api-access-99vwf\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.809983 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f88e64c-929a-4a97-a3a1-a92face17060-frr-startup\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.810010 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-metrics\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.810030 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-reloader\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.810059 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-frr-sockets\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.810101 4610 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.810126 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-frr-conf\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.833031 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nssl6"] Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.834016 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-855xl"] Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.834838 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.835394 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-nssl6" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.840621 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-bs2sv" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.840829 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.841234 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.842227 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.842439 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.843524 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-855xl"] Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911671 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-metrics\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911728 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldw2\" (UniqueName: \"kubernetes.io/projected/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-kube-api-access-qldw2\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911758 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-reloader\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 
08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911783 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-frr-sockets\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911807 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0585866f-da3e-4ab2-83a7-e0819349eb4d-metallb-excludel2\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911830 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911851 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911879 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-frr-conf\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911909 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbrwz\" (UniqueName: \"kubernetes.io/projected/0585866f-da3e-4ab2-83a7-e0819349eb4d-kube-api-access-bbrwz\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911937 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f88e64c-929a-4a97-a3a1-a92face17060-metrics-certs\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911968 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq9hv\" (UniqueName: \"kubernetes.io/projected/e0af20a6-573f-4421-b9a4-5d5005a855b8-kube-api-access-pq9hv\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.911989 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-metrics-certs\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.912016 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-99vwf\" (UniqueName: \"kubernetes.io/projected/0f88e64c-929a-4a97-a3a1-a92face17060-kube-api-access-99vwf\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.912048 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-metrics-certs\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.912470 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-metrics\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.912714 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-reloader\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.912893 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-frr-sockets\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: E1006 08:55:28.912971 4610 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 06 08:55:28 crc kubenswrapper[4610]: E1006 08:55:28.913013 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert podName:e0af20a6-573f-4421-b9a4-5d5005a855b8 nodeName:}" failed. No retries permitted until 2025-10-06 08:55:29.412997862 +0000 UTC m=+861.128051250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert") pod "frr-k8s-webhook-server-64bf5d555-6pnsd" (UID: "e0af20a6-573f-4421-b9a4-5d5005a855b8") : secret "frr-k8s-webhook-server-cert" not found Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.913453 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f88e64c-929a-4a97-a3a1-a92face17060-frr-conf\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: E1006 08:55:28.913528 4610 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 06 08:55:28 crc kubenswrapper[4610]: E1006 08:55:28.913550 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f88e64c-929a-4a97-a3a1-a92face17060-metrics-certs podName:0f88e64c-929a-4a97-a3a1-a92face17060 nodeName:}" failed. No retries permitted until 2025-10-06 08:55:29.413543237 +0000 UTC m=+861.128596615 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f88e64c-929a-4a97-a3a1-a92face17060-metrics-certs") pod "frr-k8s-rmk9w" (UID: "0f88e64c-929a-4a97-a3a1-a92face17060") : secret "frr-k8s-certs-secret" not found Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.913705 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-cert\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.913743 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f88e64c-929a-4a97-a3a1-a92face17060-frr-startup\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.914507 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f88e64c-929a-4a97-a3a1-a92face17060-frr-startup\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.939917 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq9hv\" (UniqueName: \"kubernetes.io/projected/e0af20a6-573f-4421-b9a4-5d5005a855b8-kube-api-access-pq9hv\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:28 crc kubenswrapper[4610]: I1006 08:55:28.947520 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vwf\" (UniqueName: \"kubernetes.io/projected/0f88e64c-929a-4a97-a3a1-a92face17060-kube-api-access-99vwf\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.014960 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-cert\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015011 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldw2\" (UniqueName: \"kubernetes.io/projected/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-kube-api-access-qldw2\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015039 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0585866f-da3e-4ab2-83a7-e0819349eb4d-metallb-excludel2\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015079 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist\") pod \"speaker-nssl6\" 
(UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015107 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbrwz\" (UniqueName: \"kubernetes.io/projected/0585866f-da3e-4ab2-83a7-e0819349eb4d-kube-api-access-bbrwz\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015145 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-metrics-certs\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015168 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-metrics-certs\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.015272 4610 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.015319 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-metrics-certs podName:ddd30bb0-f54e-4aa2-81c2-f27b83aaf443 nodeName:}" failed. No retries permitted until 2025-10-06 08:55:29.515305086 +0000 UTC m=+861.230358474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-metrics-certs") pod "controller-68d546b9d8-855xl" (UID: "ddd30bb0-f54e-4aa2-81c2-f27b83aaf443") : secret "controller-certs-secret" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.015497 4610 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.015538 4610 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.015546 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist podName:0585866f-da3e-4ab2-83a7-e0819349eb4d nodeName:}" failed. No retries permitted until 2025-10-06 08:55:29.515530932 +0000 UTC m=+861.230584320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist") pod "speaker-nssl6" (UID: "0585866f-da3e-4ab2-83a7-e0819349eb4d") : secret "metallb-memberlist" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.015561 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-metrics-certs podName:0585866f-da3e-4ab2-83a7-e0819349eb4d nodeName:}" failed. No retries permitted until 2025-10-06 08:55:29.515554902 +0000 UTC m=+861.230608290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-metrics-certs") pod "speaker-nssl6" (UID: "0585866f-da3e-4ab2-83a7-e0819349eb4d") : secret "speaker-certs-secret" not found Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.015826 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0585866f-da3e-4ab2-83a7-e0819349eb4d-metallb-excludel2\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.018158 4610 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.042690 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-cert\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.046836 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbrwz\" (UniqueName: \"kubernetes.io/projected/0585866f-da3e-4ab2-83a7-e0819349eb4d-kube-api-access-bbrwz\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.053690 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldw2\" (UniqueName: \"kubernetes.io/projected/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-kube-api-access-qldw2\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.419346 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.419429 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f88e64c-929a-4a97-a3a1-a92face17060-metrics-certs\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.419787 4610 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.419886 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert podName:e0af20a6-573f-4421-b9a4-5d5005a855b8 nodeName:}" failed. No retries permitted until 2025-10-06 08:55:30.419865914 +0000 UTC m=+862.134919312 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert") pod "frr-k8s-webhook-server-64bf5d555-6pnsd" (UID: "e0af20a6-573f-4421-b9a4-5d5005a855b8") : secret "frr-k8s-webhook-server-cert" not found Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.424594 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f88e64c-929a-4a97-a3a1-a92face17060-metrics-certs\") pod \"frr-k8s-rmk9w\" (UID: \"0f88e64c-929a-4a97-a3a1-a92face17060\") " pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.520142 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-metrics-certs\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.520211 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-metrics-certs\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.520270 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.520382 4610 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 08:55:29 crc kubenswrapper[4610]: E1006 08:55:29.520432 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist podName:0585866f-da3e-4ab2-83a7-e0819349eb4d nodeName:}" failed. No retries permitted until 2025-10-06 08:55:30.520418741 +0000 UTC m=+862.235472129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist") pod "speaker-nssl6" (UID: "0585866f-da3e-4ab2-83a7-e0819349eb4d") : secret "metallb-memberlist" not found Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.525143 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd30bb0-f54e-4aa2-81c2-f27b83aaf443-metrics-certs\") pod \"controller-68d546b9d8-855xl\" (UID: \"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443\") " pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.525956 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-metrics-certs\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.662615 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:29 crc kubenswrapper[4610]: I1006 08:55:29.773397 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.213262 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-855xl"] Oct 06 08:55:30 crc kubenswrapper[4610]: W1006 08:55:30.220285 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd30bb0_f54e_4aa2_81c2_f27b83aaf443.slice/crio-5ff7312127f62c6048ccc4d354aa9acd97fd98c074ea2e0fec06522efc1c8feb WatchSource:0}: Error finding container 5ff7312127f62c6048ccc4d354aa9acd97fd98c074ea2e0fec06522efc1c8feb: Status 404 returned error can't find the container with id 5ff7312127f62c6048ccc4d354aa9acd97fd98c074ea2e0fec06522efc1c8feb Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.307267 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"ee5c0bca1637a5a375058b62c9107ae08d6fca5e302350026e0f7137878df87d"} Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.308246 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-855xl" event={"ID":"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443","Type":"ContainerStarted","Data":"5ff7312127f62c6048ccc4d354aa9acd97fd98c074ea2e0fec06522efc1c8feb"} Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.437706 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.443347 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0af20a6-573f-4421-b9a4-5d5005a855b8-cert\") pod \"frr-k8s-webhook-server-64bf5d555-6pnsd\" (UID: \"e0af20a6-573f-4421-b9a4-5d5005a855b8\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.538986 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.541978 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0585866f-da3e-4ab2-83a7-e0819349eb4d-memberlist\") pod \"speaker-nssl6\" (UID: \"0585866f-da3e-4ab2-83a7-e0819349eb4d\") " pod="metallb-system/speaker-nssl6" Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.553265 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.680823 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nssl6" Oct 06 08:55:30 crc kubenswrapper[4610]: W1006 08:55:30.717329 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0585866f_da3e_4ab2_83a7_e0819349eb4d.slice/crio-7540723955ba56212c683b95f84c9512eabe869bf61f155214851d97427a592f WatchSource:0}: Error finding container 7540723955ba56212c683b95f84c9512eabe869bf61f155214851d97427a592f: Status 404 returned error can't find the container with id 7540723955ba56212c683b95f84c9512eabe869bf61f155214851d97427a592f Oct 06 08:55:30 crc kubenswrapper[4610]: I1006 08:55:30.983314 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd"] Oct 06 08:55:31 crc kubenswrapper[4610]: W1006 08:55:31.008466 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0af20a6_573f_4421_b9a4_5d5005a855b8.slice/crio-bb7e6bd0e632cbd5b617432032a01d8c097579f36a72622f5ce5df54326c5475 WatchSource:0}: Error finding container bb7e6bd0e632cbd5b617432032a01d8c097579f36a72622f5ce5df54326c5475: Status 404 returned error can't find the container with id bb7e6bd0e632cbd5b617432032a01d8c097579f36a72622f5ce5df54326c5475 Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.315622 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-855xl" event={"ID":"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443","Type":"ContainerStarted","Data":"21695de0bdf0f88d2f23d6805a17dff9d044134fd9e10d0104342c95e2772de9"} Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.315851 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.315861 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-855xl" event={"ID":"ddd30bb0-f54e-4aa2-81c2-f27b83aaf443","Type":"ContainerStarted","Data":"7f39ba957ebf14a01c8b660f94c818b657bc0614d9c7accb9c3c2ea375be0234"} Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.329486 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nssl6" event={"ID":"0585866f-da3e-4ab2-83a7-e0819349eb4d","Type":"ContainerStarted","Data":"e67e6c40c2f67428f076d0d6f29f3275b9755ead9621b528e7d8619e1b2ee1a5"} Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.329530 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nssl6" event={"ID":"0585866f-da3e-4ab2-83a7-e0819349eb4d","Type":"ContainerStarted","Data":"7540723955ba56212c683b95f84c9512eabe869bf61f155214851d97427a592f"} Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.337772 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" event={"ID":"e0af20a6-573f-4421-b9a4-5d5005a855b8","Type":"ContainerStarted","Data":"bb7e6bd0e632cbd5b617432032a01d8c097579f36a72622f5ce5df54326c5475"} Oct 06 08:55:31 crc kubenswrapper[4610]: I1006 08:55:31.351589 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-855xl" podStartSLOduration=3.351569221 podStartE2EDuration="3.351569221s" podCreationTimestamp="2025-10-06 08:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:55:31.34739667 +0000 UTC 
m=+863.062450058" watchObservedRunningTime="2025-10-06 08:55:31.351569221 +0000 UTC m=+863.066622609" Oct 06 08:55:32 crc kubenswrapper[4610]: I1006 08:55:32.355964 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nssl6" event={"ID":"0585866f-da3e-4ab2-83a7-e0819349eb4d","Type":"ContainerStarted","Data":"d50f475dcebdb850529020bfcb7784a2252d942378e0242c5fb959cd6e4f9d7d"} Oct 06 08:55:32 crc kubenswrapper[4610]: I1006 08:55:32.356044 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nssl6" Oct 06 08:55:32 crc kubenswrapper[4610]: I1006 08:55:32.407497 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-nssl6" podStartSLOduration=4.407481521 podStartE2EDuration="4.407481521s" podCreationTimestamp="2025-10-06 08:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:55:32.403871535 +0000 UTC m=+864.118924923" watchObservedRunningTime="2025-10-06 08:55:32.407481521 +0000 UTC m=+864.122534909" Oct 06 08:55:38 crc kubenswrapper[4610]: I1006 08:55:38.414478 4610 generic.go:334] "Generic (PLEG): container finished" podID="0f88e64c-929a-4a97-a3a1-a92face17060" containerID="ee6154bf1da5580ede789d3a87d6b67c052c8a5a2030f12d51c71c02db511255" exitCode=0 Oct 06 08:55:38 crc kubenswrapper[4610]: I1006 08:55:38.414630 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerDied","Data":"ee6154bf1da5580ede789d3a87d6b67c052c8a5a2030f12d51c71c02db511255"} Oct 06 08:55:38 crc kubenswrapper[4610]: I1006 08:55:38.417291 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" event={"ID":"e0af20a6-573f-4421-b9a4-5d5005a855b8","Type":"ContainerStarted","Data":"4ad5f844c3c9d803206f3534453ab3e92fd44a1893bce5898fb9d84ae177e27b"} Oct 06 08:55:38 crc kubenswrapper[4610]: I1006 08:55:38.417678 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:38 crc kubenswrapper[4610]: I1006 08:55:38.467618 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" podStartSLOduration=3.297943378 podStartE2EDuration="10.467596268s" podCreationTimestamp="2025-10-06 08:55:28 +0000 UTC" firstStartedPulling="2025-10-06 08:55:31.015689633 +0000 UTC m=+862.730743021" lastFinishedPulling="2025-10-06 08:55:38.185342523 +0000 UTC m=+869.900395911" observedRunningTime="2025-10-06 08:55:38.464407574 +0000 UTC m=+870.179460982" watchObservedRunningTime="2025-10-06 08:55:38.467596268 +0000 UTC m=+870.182649666" Oct 06 08:55:39 crc kubenswrapper[4610]: I1006 08:55:39.423133 4610 generic.go:334] "Generic (PLEG): container finished" podID="0f88e64c-929a-4a97-a3a1-a92face17060" containerID="885e77069935464b6a9d7c5e57ac2750155b610cca9ff5f4421e30e0a9019196" exitCode=0 Oct 06 08:55:39 crc kubenswrapper[4610]: I1006 08:55:39.424271 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerDied","Data":"885e77069935464b6a9d7c5e57ac2750155b610cca9ff5f4421e30e0a9019196"} Oct 06 08:55:40 crc kubenswrapper[4610]: I1006 08:55:40.435170 4610 generic.go:334] "Generic (PLEG): container finished" 
podID="0f88e64c-929a-4a97-a3a1-a92face17060" containerID="37ff617fb5686a4b8a00e7749699a1b5c6f13b05f3331cec0de04b0c125b9132" exitCode=0 Oct 06 08:55:40 crc kubenswrapper[4610]: I1006 08:55:40.435280 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerDied","Data":"37ff617fb5686a4b8a00e7749699a1b5c6f13b05f3331cec0de04b0c125b9132"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443390 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"cf7147e5ce43eeb62f995f46976beeb9bef5220d9a446b79e983f3408f35a648"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443749 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443763 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"bca82130de3ab3111aeb322cd603209df2852892c48dd2cf02d6ffd0246e0e26"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443775 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"15c31e5ccaf3e000c675f708010809bf7970ffafea67ac41cadcfe0d9c5790e9"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443786 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"a57e146d86eda565c7a716624569bd423674058201412eb81e1124b1fc0c4bac"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443813 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"0704d57ae1e401ddc7477449210cec59a4b4da7afe296f9c88f98e52db884cf1"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.443825 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rmk9w" event={"ID":"0f88e64c-929a-4a97-a3a1-a92face17060","Type":"ContainerStarted","Data":"44b210fd0dea826e2ebeff0407f4952860bec2b4d908481777896603dfc377e8"} Oct 06 08:55:41 crc kubenswrapper[4610]: I1006 08:55:41.471697 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rmk9w" podStartSLOduration=5.108962764 podStartE2EDuration="13.471676292s" podCreationTimestamp="2025-10-06 08:55:28 +0000 UTC" firstStartedPulling="2025-10-06 08:55:29.798382612 +0000 UTC m=+861.513436010" lastFinishedPulling="2025-10-06 08:55:38.16109615 +0000 UTC m=+869.876149538" observedRunningTime="2025-10-06 08:55:41.465634122 +0000 UTC m=+873.180687510" watchObservedRunningTime="2025-10-06 08:55:41.471676292 +0000 UTC m=+873.186729690" Oct 06 08:55:44 crc kubenswrapper[4610]: I1006 08:55:44.663514 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:44 crc kubenswrapper[4610]: I1006 08:55:44.712131 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:55:49 crc kubenswrapper[4610]: I1006 08:55:49.776297 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-68d546b9d8-855xl" Oct 06 08:55:50 crc kubenswrapper[4610]: I1006 08:55:50.557657 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-6pnsd" Oct 06 08:55:50 crc kubenswrapper[4610]: I1006 08:55:50.684307 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-nssl6" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.589891 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tls7s"] Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.590570 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.593993 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.594413 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.596804 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nmbc6" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.602975 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tls7s"] Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.655786 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zdst\" (UniqueName: \"kubernetes.io/projected/6dcc5a70-b2d9-442f-bfa7-143c79194508-kube-api-access-5zdst\") pod \"openstack-operator-index-tls7s\" (UID: \"6dcc5a70-b2d9-442f-bfa7-143c79194508\") " pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.757801 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zdst\" (UniqueName: \"kubernetes.io/projected/6dcc5a70-b2d9-442f-bfa7-143c79194508-kube-api-access-5zdst\") pod \"openstack-operator-index-tls7s\" (UID: \"6dcc5a70-b2d9-442f-bfa7-143c79194508\") " pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.779823 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zdst\" (UniqueName: \"kubernetes.io/projected/6dcc5a70-b2d9-442f-bfa7-143c79194508-kube-api-access-5zdst\") pod \"openstack-operator-index-tls7s\" (UID: \"6dcc5a70-b2d9-442f-bfa7-143c79194508\") " pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:53 crc kubenswrapper[4610]: I1006 08:55:53.936532 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:54 crc kubenswrapper[4610]: I1006 08:55:54.339232 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tls7s"] Oct 06 08:55:54 crc kubenswrapper[4610]: W1006 08:55:54.353303 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dcc5a70_b2d9_442f_bfa7_143c79194508.slice/crio-d928bb8b64aa8bfbf18259c6befc00ea1da33f4fd02ce51c90844868d712d562 WatchSource:0}: Error finding container d928bb8b64aa8bfbf18259c6befc00ea1da33f4fd02ce51c90844868d712d562: Status 404 returned error can't find the container with id d928bb8b64aa8bfbf18259c6befc00ea1da33f4fd02ce51c90844868d712d562 Oct 06 08:55:54 crc kubenswrapper[4610]: I1006 08:55:54.534741 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tls7s" event={"ID":"6dcc5a70-b2d9-442f-bfa7-143c79194508","Type":"ContainerStarted","Data":"d928bb8b64aa8bfbf18259c6befc00ea1da33f4fd02ce51c90844868d712d562"} Oct 06 08:55:57 crc kubenswrapper[4610]: I1006 08:55:57.551614 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-tls7s"] Oct 06 08:55:57 crc kubenswrapper[4610]: I1006 08:55:57.568341 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tls7s" event={"ID":"6dcc5a70-b2d9-442f-bfa7-143c79194508","Type":"ContainerStarted","Data":"e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e"} Oct 06 08:55:57 crc kubenswrapper[4610]: I1006 08:55:57.587543 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tls7s" podStartSLOduration=2.338009387 podStartE2EDuration="4.587520822s" podCreationTimestamp="2025-10-06 08:55:53 +0000 UTC" firstStartedPulling="2025-10-06 08:55:54.354738493 +0000 UTC m=+886.069791881" lastFinishedPulling="2025-10-06 08:55:56.604249928 +0000 UTC m=+888.319303316" observedRunningTime="2025-10-06 08:55:57.583880036 +0000 UTC m=+889.298933434" watchObservedRunningTime="2025-10-06 08:55:57.587520822 +0000 UTC m=+889.302574210" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.158110 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qh9tx"] Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.158972 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.168639 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qh9tx"] Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.225014 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdz5h\" (UniqueName: \"kubernetes.io/projected/02ca177f-d4f8-419b-babe-caeb9a7272fe-kube-api-access-kdz5h\") pod \"openstack-operator-index-qh9tx\" (UID: \"02ca177f-d4f8-419b-babe-caeb9a7272fe\") " pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.327356 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdz5h\" (UniqueName: \"kubernetes.io/projected/02ca177f-d4f8-419b-babe-caeb9a7272fe-kube-api-access-kdz5h\") pod \"openstack-operator-index-qh9tx\" (UID: \"02ca177f-d4f8-419b-babe-caeb9a7272fe\") " pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.348449 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdz5h\" (UniqueName: \"kubernetes.io/projected/02ca177f-d4f8-419b-babe-caeb9a7272fe-kube-api-access-kdz5h\") pod \"openstack-operator-index-qh9tx\" (UID: \"02ca177f-d4f8-419b-babe-caeb9a7272fe\") " pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.494716 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.573518 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-tls7s" podUID="6dcc5a70-b2d9-442f-bfa7-143c79194508" containerName="registry-server" containerID="cri-o://e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e" gracePeriod=2 Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.934741 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:58 crc kubenswrapper[4610]: I1006 08:55:58.945944 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qh9tx"] Oct 06 08:55:58 crc kubenswrapper[4610]: W1006 08:55:58.957510 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ca177f_d4f8_419b_babe_caeb9a7272fe.slice/crio-cc78639dded197f6c141df887a13f013b7418003f4de0940aeab03af4e365ff5 WatchSource:0}: Error finding container cc78639dded197f6c141df887a13f013b7418003f4de0940aeab03af4e365ff5: Status 404 returned error can't find the container with id cc78639dded197f6c141df887a13f013b7418003f4de0940aeab03af4e365ff5 Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.035513 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zdst\" (UniqueName: \"kubernetes.io/projected/6dcc5a70-b2d9-442f-bfa7-143c79194508-kube-api-access-5zdst\") pod \"6dcc5a70-b2d9-442f-bfa7-143c79194508\" (UID: \"6dcc5a70-b2d9-442f-bfa7-143c79194508\") " Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.040516 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dcc5a70-b2d9-442f-bfa7-143c79194508-kube-api-access-5zdst" (OuterVolumeSpecName: "kube-api-access-5zdst") pod "6dcc5a70-b2d9-442f-bfa7-143c79194508" (UID: "6dcc5a70-b2d9-442f-bfa7-143c79194508"). InnerVolumeSpecName "kube-api-access-5zdst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.137635 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zdst\" (UniqueName: \"kubernetes.io/projected/6dcc5a70-b2d9-442f-bfa7-143c79194508-kube-api-access-5zdst\") on node \"crc\" DevicePath \"\"" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.583230 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qh9tx" event={"ID":"02ca177f-d4f8-419b-babe-caeb9a7272fe","Type":"ContainerStarted","Data":"3324a2a4af9766f66bab91d0de2f2867967702776effaff9ffb23702a9ac8898"} Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.583571 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qh9tx" event={"ID":"02ca177f-d4f8-419b-babe-caeb9a7272fe","Type":"ContainerStarted","Data":"cc78639dded197f6c141df887a13f013b7418003f4de0940aeab03af4e365ff5"} Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.585608 4610 generic.go:334] "Generic (PLEG): container finished" podID="6dcc5a70-b2d9-442f-bfa7-143c79194508" containerID="e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e" exitCode=0 Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.585635 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tls7s" event={"ID":"6dcc5a70-b2d9-442f-bfa7-143c79194508","Type":"ContainerDied","Data":"e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e"} Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.585661 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tls7s" event={"ID":"6dcc5a70-b2d9-442f-bfa7-143c79194508","Type":"ContainerDied","Data":"d928bb8b64aa8bfbf18259c6befc00ea1da33f4fd02ce51c90844868d712d562"} Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.585680 4610 scope.go:117] 
"RemoveContainer" containerID="e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.585676 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tls7s" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.606307 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qh9tx" podStartSLOduration=1.5545346260000001 podStartE2EDuration="1.606278638s" podCreationTimestamp="2025-10-06 08:55:58 +0000 UTC" firstStartedPulling="2025-10-06 08:55:58.96191482 +0000 UTC m=+890.676968218" lastFinishedPulling="2025-10-06 08:55:59.013658842 +0000 UTC m=+890.728712230" observedRunningTime="2025-10-06 08:55:59.604821079 +0000 UTC m=+891.319874497" watchObservedRunningTime="2025-10-06 08:55:59.606278638 +0000 UTC m=+891.321332066" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.625770 4610 scope.go:117] "RemoveContainer" containerID="e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e" Oct 06 08:55:59 crc kubenswrapper[4610]: E1006 08:55:59.626401 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e\": container with ID starting with e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e not found: ID does not exist" containerID="e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.626704 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e"} err="failed to get container status \"e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e\": rpc error: code = NotFound desc = could not find container \"e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e\": container with ID starting with e5f1b3fcf138843d6eafa2c53b543130306250738d9fac125f7af9c0db17ca5e not found: ID does not exist" Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.629445 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-tls7s"] Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.632671 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-tls7s"] Oct 06 08:55:59 crc kubenswrapper[4610]: I1006 08:55:59.667307 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rmk9w" Oct 06 08:56:01 crc kubenswrapper[4610]: I1006 08:56:01.081972 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dcc5a70-b2d9-442f-bfa7-143c79194508" path="/var/lib/kubelet/pods/6dcc5a70-b2d9-442f-bfa7-143c79194508/volumes" Oct 06 08:56:08 crc kubenswrapper[4610]: I1006 08:56:08.495540 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:56:08 crc kubenswrapper[4610]: I1006 08:56:08.495947 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:56:08 crc kubenswrapper[4610]: I1006 08:56:08.543939 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:56:08 crc 
kubenswrapper[4610]: I1006 08:56:08.673536 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qh9tx" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.496774 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm"] Oct 06 08:56:15 crc kubenswrapper[4610]: E1006 08:56:15.497190 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dcc5a70-b2d9-442f-bfa7-143c79194508" containerName="registry-server" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.497201 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dcc5a70-b2d9-442f-bfa7-143c79194508" containerName="registry-server" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.497317 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dcc5a70-b2d9-442f-bfa7-143c79194508" containerName="registry-server" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.498000 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.502252 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tq5qf" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.514974 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm"] Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.593935 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rd5\" (UniqueName: \"kubernetes.io/projected/a7c9f18a-e16b-45ec-9d46-e879df2773ab-kube-api-access-r6rd5\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.594081 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-util\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.594146 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-bundle\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.695010 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-bundle\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.695175 
4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rd5\" (UniqueName: \"kubernetes.io/projected/a7c9f18a-e16b-45ec-9d46-e879df2773ab-kube-api-access-r6rd5\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.695373 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-util\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.695709 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-bundle\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.696505 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-util\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.731847 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rd5\" (UniqueName: \"kubernetes.io/projected/a7c9f18a-e16b-45ec-9d46-e879df2773ab-kube-api-access-r6rd5\") pod \"cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:15 crc kubenswrapper[4610]: I1006 08:56:15.814427 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:16 crc kubenswrapper[4610]: I1006 08:56:16.220314 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm"] Oct 06 08:56:16 crc kubenswrapper[4610]: W1006 08:56:16.224220 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c9f18a_e16b_45ec_9d46_e879df2773ab.slice/crio-9b001272e03aab9f6bc6ab424f4d59e4bde246dab3716661a749b670b2e09785 WatchSource:0}: Error finding container 9b001272e03aab9f6bc6ab424f4d59e4bde246dab3716661a749b670b2e09785: Status 404 returned error can't find the container with id 9b001272e03aab9f6bc6ab424f4d59e4bde246dab3716661a749b670b2e09785 Oct 06 08:56:16 crc kubenswrapper[4610]: I1006 08:56:16.469218 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:56:16 crc kubenswrapper[4610]: I1006 08:56:16.469671 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:56:16 crc kubenswrapper[4610]: I1006 08:56:16.707213 4610 generic.go:334] "Generic (PLEG): container finished" podID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerID="703bb4afc0711095c3ccd2368cdbef801efc7ab53c31b77451f42679b447c947" exitCode=0 Oct 06 08:56:16 crc kubenswrapper[4610]: I1006 08:56:16.707269 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" event={"ID":"a7c9f18a-e16b-45ec-9d46-e879df2773ab","Type":"ContainerDied","Data":"703bb4afc0711095c3ccd2368cdbef801efc7ab53c31b77451f42679b447c947"} Oct 06 08:56:16 crc kubenswrapper[4610]: I1006 08:56:16.707310 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" event={"ID":"a7c9f18a-e16b-45ec-9d46-e879df2773ab","Type":"ContainerStarted","Data":"9b001272e03aab9f6bc6ab424f4d59e4bde246dab3716661a749b670b2e09785"} Oct 06 08:56:17 crc kubenswrapper[4610]: I1006 08:56:17.717378 4610 generic.go:334] "Generic (PLEG): container finished" podID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerID="4becb7fa523a3808672918a3a9e2beb89e6badcc7ef9182cfc845e357811e74e" exitCode=0 Oct 06 08:56:17 crc kubenswrapper[4610]: I1006 08:56:17.717625 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" event={"ID":"a7c9f18a-e16b-45ec-9d46-e879df2773ab","Type":"ContainerDied","Data":"4becb7fa523a3808672918a3a9e2beb89e6badcc7ef9182cfc845e357811e74e"} Oct 06 08:56:18 crc kubenswrapper[4610]: I1006 08:56:18.727976 4610 generic.go:334] "Generic (PLEG): container finished" podID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerID="167544ae4e07cde17799e6049f77c1cf987b0189b5f84599964087020a4a260b" exitCode=0 Oct 06 08:56:18 crc kubenswrapper[4610]: I1006 08:56:18.728026 4610 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" event={"ID":"a7c9f18a-e16b-45ec-9d46-e879df2773ab","Type":"ContainerDied","Data":"167544ae4e07cde17799e6049f77c1cf987b0189b5f84599964087020a4a260b"} Oct 06 08:56:19 crc kubenswrapper[4610]: I1006 08:56:19.983232 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.052559 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6rd5\" (UniqueName: \"kubernetes.io/projected/a7c9f18a-e16b-45ec-9d46-e879df2773ab-kube-api-access-r6rd5\") pod \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.052601 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-bundle\") pod \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.052661 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-util\") pod \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\" (UID: \"a7c9f18a-e16b-45ec-9d46-e879df2773ab\") " Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.053289 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-bundle" (OuterVolumeSpecName: "bundle") pod "a7c9f18a-e16b-45ec-9d46-e879df2773ab" (UID: "a7c9f18a-e16b-45ec-9d46-e879df2773ab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.057168 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c9f18a-e16b-45ec-9d46-e879df2773ab-kube-api-access-r6rd5" (OuterVolumeSpecName: "kube-api-access-r6rd5") pod "a7c9f18a-e16b-45ec-9d46-e879df2773ab" (UID: "a7c9f18a-e16b-45ec-9d46-e879df2773ab"). InnerVolumeSpecName "kube-api-access-r6rd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.066742 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-util" (OuterVolumeSpecName: "util") pod "a7c9f18a-e16b-45ec-9d46-e879df2773ab" (UID: "a7c9f18a-e16b-45ec-9d46-e879df2773ab"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.154311 4610 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.154485 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6rd5\" (UniqueName: \"kubernetes.io/projected/a7c9f18a-e16b-45ec-9d46-e879df2773ab-kube-api-access-r6rd5\") on node \"crc\" DevicePath \"\"" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.154619 4610 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c9f18a-e16b-45ec-9d46-e879df2773ab-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.743712 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" event={"ID":"a7c9f18a-e16b-45ec-9d46-e879df2773ab","Type":"ContainerDied","Data":"9b001272e03aab9f6bc6ab424f4d59e4bde246dab3716661a749b670b2e09785"} Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.744331 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b001272e03aab9f6bc6ab424f4d59e4bde246dab3716661a749b670b2e09785" Oct 06 08:56:20 crc kubenswrapper[4610]: I1006 08:56:20.744591 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.868387 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56"] Oct 06 08:56:27 crc kubenswrapper[4610]: E1006 08:56:27.868791 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="pull" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.868802 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="pull" Oct 06 08:56:27 crc kubenswrapper[4610]: E1006 08:56:27.868815 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="util" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.868821 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="util" Oct 06 08:56:27 crc kubenswrapper[4610]: E1006 08:56:27.868831 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="extract" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.868838 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="extract" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.868938 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c9f18a-e16b-45ec-9d46-e879df2773ab" containerName="extract" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.869675 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.871770 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ptnnb" Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.907255 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56"] Oct 06 08:56:27 crc kubenswrapper[4610]: I1006 08:56:27.956869 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzz8q\" (UniqueName: \"kubernetes.io/projected/f88899ff-f714-4c64-8a83-bf97a4c80c1b-kube-api-access-lzz8q\") pod \"openstack-operator-controller-operator-6497dff45c-kjs56\" (UID: \"f88899ff-f714-4c64-8a83-bf97a4c80c1b\") " pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:28 crc kubenswrapper[4610]: I1006 08:56:28.058395 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzz8q\" (UniqueName: \"kubernetes.io/projected/f88899ff-f714-4c64-8a83-bf97a4c80c1b-kube-api-access-lzz8q\") pod \"openstack-operator-controller-operator-6497dff45c-kjs56\" (UID: \"f88899ff-f714-4c64-8a83-bf97a4c80c1b\") " pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:28 crc kubenswrapper[4610]: I1006 08:56:28.092503 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzz8q\" (UniqueName: \"kubernetes.io/projected/f88899ff-f714-4c64-8a83-bf97a4c80c1b-kube-api-access-lzz8q\") pod \"openstack-operator-controller-operator-6497dff45c-kjs56\" (UID: \"f88899ff-f714-4c64-8a83-bf97a4c80c1b\") " pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:28 crc kubenswrapper[4610]: I1006 08:56:28.185504 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:28 crc kubenswrapper[4610]: I1006 08:56:28.676626 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56"] Oct 06 08:56:28 crc kubenswrapper[4610]: W1006 08:56:28.680931 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf88899ff_f714_4c64_8a83_bf97a4c80c1b.slice/crio-2ef4d45843221567ce4f8dbea2d74727020f4238c35b1ef4a5dcc01821d0a265 WatchSource:0}: Error finding container 2ef4d45843221567ce4f8dbea2d74727020f4238c35b1ef4a5dcc01821d0a265: Status 404 returned error can't find the container with id 2ef4d45843221567ce4f8dbea2d74727020f4238c35b1ef4a5dcc01821d0a265 Oct 06 08:56:28 crc kubenswrapper[4610]: I1006 08:56:28.799891 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" event={"ID":"f88899ff-f714-4c64-8a83-bf97a4c80c1b","Type":"ContainerStarted","Data":"2ef4d45843221567ce4f8dbea2d74727020f4238c35b1ef4a5dcc01821d0a265"} Oct 06 08:56:32 crc kubenswrapper[4610]: I1006 08:56:32.850310 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" event={"ID":"f88899ff-f714-4c64-8a83-bf97a4c80c1b","Type":"ContainerStarted","Data":"d462fc9fbcf5eee82b45d2b273838517fb3f1eec3c5b36f37b62cb7c860a43bd"} Oct 06 08:56:35 crc kubenswrapper[4610]: I1006 08:56:35.876113 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" event={"ID":"f88899ff-f714-4c64-8a83-bf97a4c80c1b","Type":"ContainerStarted","Data":"a13a31d64136e53660576df761cd70ebdd0189a5ad03cee52f170788af6feecb"} Oct 06 08:56:35 crc kubenswrapper[4610]: I1006 08:56:35.876418 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:35 crc kubenswrapper[4610]: I1006 08:56:35.909949 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" podStartSLOduration=2.60924889 podStartE2EDuration="8.909931886s" podCreationTimestamp="2025-10-06 08:56:27 +0000 UTC" firstStartedPulling="2025-10-06 08:56:28.68260943 +0000 UTC m=+920.397662818" lastFinishedPulling="2025-10-06 08:56:34.983292426 +0000 UTC m=+926.698345814" observedRunningTime="2025-10-06 08:56:35.906496594 +0000 UTC m=+927.621549982" watchObservedRunningTime="2025-10-06 08:56:35.909931886 +0000 UTC m=+927.624985274" Oct 06 08:56:38 crc kubenswrapper[4610]: I1006 08:56:38.189119 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6497dff45c-kjs56" Oct 06 08:56:46 crc kubenswrapper[4610]: I1006 08:56:46.468653 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:56:46 crc kubenswrapper[4610]: I1006 08:56:46.468982 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" 
podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.544228 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.545667 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.548820 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.550128 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.557242 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-6tgxq" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.558259 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.559156 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.562864 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8kg5h" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.563917 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gw252" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.591666 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.598227 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.612099 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-z448l"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.613302 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.616933 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4xmdf" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.642864 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-z448l"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.647408 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.682852 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtnd\" (UniqueName: \"kubernetes.io/projected/e8d37aed-cf46-47a0-a8ea-cfec57404966-kube-api-access-tqtnd\") pod \"barbican-operator-controller-manager-5f7c849b98-gcbb8\" (UID: \"e8d37aed-cf46-47a0-a8ea-cfec57404966\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.682957 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffr2n\" (UniqueName: \"kubernetes.io/projected/23749a1a-8450-4412-850b-1e044d290c69-kube-api-access-ffr2n\") pod \"designate-operator-controller-manager-75dfd9b554-f6dh9\" (UID: \"23749a1a-8450-4412-850b-1e044d290c69\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.683177 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4b5\" (UniqueName: \"kubernetes.io/projected/590d1736-08ea-4b24-9462-51e4f9eb2169-kube-api-access-hd4b5\") pod \"cinder-operator-controller-manager-7d4d4f8d-65ffb\" (UID: \"590d1736-08ea-4b24-9462-51e4f9eb2169\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.688471 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.690133 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.715370 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-smm4d" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.723098 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.724395 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.735510 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gslm6" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.757896 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.773990 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.775216 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.779878 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-n7wf4" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.784403 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4b5\" (UniqueName: \"kubernetes.io/projected/590d1736-08ea-4b24-9462-51e4f9eb2169-kube-api-access-hd4b5\") pod \"cinder-operator-controller-manager-7d4d4f8d-65ffb\" (UID: \"590d1736-08ea-4b24-9462-51e4f9eb2169\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.784613 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2rp4\" (UniqueName: \"kubernetes.io/projected/55d8474b-1188-4617-abe4-d5e45d9a85cb-kube-api-access-t2rp4\") pod \"glance-operator-controller-manager-5568b5d68-z448l\" (UID: \"55d8474b-1188-4617-abe4-d5e45d9a85cb\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.784713 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqtnd\" (UniqueName: \"kubernetes.io/projected/e8d37aed-cf46-47a0-a8ea-cfec57404966-kube-api-access-tqtnd\") pod \"barbican-operator-controller-manager-5f7c849b98-gcbb8\" (UID: \"e8d37aed-cf46-47a0-a8ea-cfec57404966\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.784829 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffr2n\" (UniqueName: \"kubernetes.io/projected/23749a1a-8450-4412-850b-1e044d290c69-kube-api-access-ffr2n\") pod \"designate-operator-controller-manager-75dfd9b554-f6dh9\" (UID: \"23749a1a-8450-4412-850b-1e044d290c69\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.786008 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.799098 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.811297 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.825432 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4b5\" (UniqueName: \"kubernetes.io/projected/590d1736-08ea-4b24-9462-51e4f9eb2169-kube-api-access-hd4b5\") pod \"cinder-operator-controller-manager-7d4d4f8d-65ffb\" (UID: \"590d1736-08ea-4b24-9462-51e4f9eb2169\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.835181 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqtnd\" (UniqueName: \"kubernetes.io/projected/e8d37aed-cf46-47a0-a8ea-cfec57404966-kube-api-access-tqtnd\") pod \"barbican-operator-controller-manager-5f7c849b98-gcbb8\" (UID: \"e8d37aed-cf46-47a0-a8ea-cfec57404966\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.848096 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffr2n\" (UniqueName: \"kubernetes.io/projected/23749a1a-8450-4412-850b-1e044d290c69-kube-api-access-ffr2n\") pod \"designate-operator-controller-manager-75dfd9b554-f6dh9\" (UID: \"23749a1a-8450-4412-850b-1e044d290c69\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.862342 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.879021 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.879161 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-575v4"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.880344 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.891265 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.892548 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8kpr\" (UniqueName: \"kubernetes.io/projected/aa003bf3-ca26-468d-975a-5ceaa0361f14-kube-api-access-w8kpr\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.892787 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqntj\" (UniqueName: \"kubernetes.io/projected/08b5e994-103b-40ba-aef6-4dd36e5c456e-kube-api-access-nqntj\") pod \"heat-operator-controller-manager-8f58bc9db-xfwfl\" (UID: \"08b5e994-103b-40ba-aef6-4dd36e5c456e\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.892915 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.893097 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2rp4\" (UniqueName: \"kubernetes.io/projected/55d8474b-1188-4617-abe4-d5e45d9a85cb-kube-api-access-t2rp4\") pod \"glance-operator-controller-manager-5568b5d68-z448l\" (UID: \"55d8474b-1188-4617-abe4-d5e45d9a85cb\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.893290 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzf4d\" (UniqueName: \"kubernetes.io/projected/b63b18e4-4aee-4a86-a5cb-23393a3cfaa3-kube-api-access-pzf4d\") pod \"horizon-operator-controller-manager-54876c876f-cqqbc\" (UID: \"b63b18e4-4aee-4a86-a5cb-23393a3cfaa3\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.896456 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8zxbs" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.911427 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-575v4"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.917274 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.918254 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.926792 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.928684 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.945955 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-dlzk7" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.946397 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xhvc6" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.963124 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2rp4\" (UniqueName: \"kubernetes.io/projected/55d8474b-1188-4617-abe4-d5e45d9a85cb-kube-api-access-t2rp4\") pod \"glance-operator-controller-manager-5568b5d68-z448l\" (UID: \"55d8474b-1188-4617-abe4-d5e45d9a85cb\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.968179 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.972204 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl"] Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.995645 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqntj\" (UniqueName: \"kubernetes.io/projected/08b5e994-103b-40ba-aef6-4dd36e5c456e-kube-api-access-nqntj\") pod \"heat-operator-controller-manager-8f58bc9db-xfwfl\" (UID: \"08b5e994-103b-40ba-aef6-4dd36e5c456e\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.995680 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.995733 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzf4d\" (UniqueName: \"kubernetes.io/projected/b63b18e4-4aee-4a86-a5cb-23393a3cfaa3-kube-api-access-pzf4d\") pod \"horizon-operator-controller-manager-54876c876f-cqqbc\" (UID: \"b63b18e4-4aee-4a86-a5cb-23393a3cfaa3\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.995758 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lxg\" (UniqueName: \"kubernetes.io/projected/8c2d89eb-7d33-4268-901a-69b008f224d4-kube-api-access-q6lxg\") pod \"ironic-operator-controller-manager-699b87f775-575v4\" (UID: \"8c2d89eb-7d33-4268-901a-69b008f224d4\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:12 crc kubenswrapper[4610]: I1006 08:57:12.995787 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8kpr\" (UniqueName: \"kubernetes.io/projected/aa003bf3-ca26-468d-975a-5ceaa0361f14-kube-api-access-w8kpr\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:12 crc kubenswrapper[4610]: E1006 08:57:12.996240 4610 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 06 08:57:12 crc kubenswrapper[4610]: E1006 08:57:12.996283 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert podName:aa003bf3-ca26-468d-975a-5ceaa0361f14 nodeName:}" failed. No retries permitted until 2025-10-06 08:57:13.496267953 +0000 UTC m=+965.211321341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert") pod "infra-operator-controller-manager-658588b8c9-4hldf" (UID: "aa003bf3-ca26-468d-975a-5ceaa0361f14") : secret "infra-operator-webhook-server-cert" not found Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.028763 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.029967 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.033671 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wp8bl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.041763 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.042928 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.054587 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqntj\" (UniqueName: \"kubernetes.io/projected/08b5e994-103b-40ba-aef6-4dd36e5c456e-kube-api-access-nqntj\") pod \"heat-operator-controller-manager-8f58bc9db-xfwfl\" (UID: \"08b5e994-103b-40ba-aef6-4dd36e5c456e\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.058035 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzf4d\" (UniqueName: \"kubernetes.io/projected/b63b18e4-4aee-4a86-a5cb-23393a3cfaa3-kube-api-access-pzf4d\") pod \"horizon-operator-controller-manager-54876c876f-cqqbc\" (UID: \"b63b18e4-4aee-4a86-a5cb-23393a3cfaa3\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.061384 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2f4hk" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.066878 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8kpr\" (UniqueName: \"kubernetes.io/projected/aa003bf3-ca26-468d-975a-5ceaa0361f14-kube-api-access-w8kpr\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.097946 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lxg\" (UniqueName: \"kubernetes.io/projected/8c2d89eb-7d33-4268-901a-69b008f224d4-kube-api-access-q6lxg\") pod \"ironic-operator-controller-manager-699b87f775-575v4\" (UID: \"8c2d89eb-7d33-4268-901a-69b008f224d4\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.098094 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25f2d\" (UniqueName: \"kubernetes.io/projected/2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629-kube-api-access-25f2d\") pod \"keystone-operator-controller-manager-655d88ccb9-7sghv\" (UID: \"2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.098128 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tqrs\" (UniqueName: \"kubernetes.io/projected/8ef0dcc5-529c-4a68-ba57-c68198a73de0-kube-api-access-5tqrs\") pod \"manila-operator-controller-manager-65d89cfd9f-6dqrl\" (UID: \"8ef0dcc5-529c-4a68-ba57-c68198a73de0\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.137269 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.137599 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.138678 4610 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.139354 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.139561 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.139936 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.141211 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.149707 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qqgtv" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.152598 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.155021 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-r2c47" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.168590 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lxg\" (UniqueName: \"kubernetes.io/projected/8c2d89eb-7d33-4268-901a-69b008f224d4-kube-api-access-q6lxg\") pod \"ironic-operator-controller-manager-699b87f775-575v4\" (UID: \"8c2d89eb-7d33-4268-901a-69b008f224d4\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.200943 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msc7j\" (UniqueName: \"kubernetes.io/projected/0dfb923d-89c5-4fd0-af84-b73494c4cfc2-kube-api-access-msc7j\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-852df\" (UID: \"0dfb923d-89c5-4fd0-af84-b73494c4cfc2\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.201028 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25f2d\" (UniqueName: \"kubernetes.io/projected/2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629-kube-api-access-25f2d\") pod \"keystone-operator-controller-manager-655d88ccb9-7sghv\" (UID: \"2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.201113 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tqrs\" (UniqueName: \"kubernetes.io/projected/8ef0dcc5-529c-4a68-ba57-c68198a73de0-kube-api-access-5tqrs\") pod \"manila-operator-controller-manager-65d89cfd9f-6dqrl\" (UID: \"8ef0dcc5-529c-4a68-ba57-c68198a73de0\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.201150 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qt7tk\" (UniqueName: \"kubernetes.io/projected/54de0bca-8a80-49a0-ae9f-0fe13fdeda11-kube-api-access-qt7tk\") pod \"neutron-operator-controller-manager-8d984cc4d-n7mj5\" (UID: \"54de0bca-8a80-49a0-ae9f-0fe13fdeda11\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.220787 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.221937 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.230812 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zz2zj" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.231165 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.239523 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.240499 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.247166 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.247412 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bv6rs" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.265324 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25f2d\" (UniqueName: \"kubernetes.io/projected/2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629-kube-api-access-25f2d\") pod \"keystone-operator-controller-manager-655d88ccb9-7sghv\" (UID: \"2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.265675 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tqrs\" (UniqueName: \"kubernetes.io/projected/8ef0dcc5-529c-4a68-ba57-c68198a73de0-kube-api-access-5tqrs\") pod \"manila-operator-controller-manager-65d89cfd9f-6dqrl\" (UID: \"8ef0dcc5-529c-4a68-ba57-c68198a73de0\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.277534 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.300112 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.301122 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.302362 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czvvd\" (UniqueName: \"kubernetes.io/projected/10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95-kube-api-access-czvvd\") pod \"ovn-operator-controller-manager-579449c7d5-j9k2v\" (UID: \"10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.302444 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rpxq\" (UniqueName: \"kubernetes.io/projected/95dcc684-207d-4745-949b-d2bd559b9f18-kube-api-access-9rpxq\") pod \"nova-operator-controller-manager-7c7fc454ff-6h5w4\" (UID: \"95dcc684-207d-4745-949b-d2bd559b9f18\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.302493 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmd5q\" (UniqueName: \"kubernetes.io/projected/e30086d8-8211-4ef0-ae80-ec1d79719f51-kube-api-access-xmd5q\") pod \"octavia-operator-controller-manager-7468f855d8-5nhb8\" (UID: \"e30086d8-8211-4ef0-ae80-ec1d79719f51\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.302523 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msc7j\" (UniqueName: \"kubernetes.io/projected/0dfb923d-89c5-4fd0-af84-b73494c4cfc2-kube-api-access-msc7j\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-852df\" (UID: \"0dfb923d-89c5-4fd0-af84-b73494c4cfc2\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.302594 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt7tk\" (UniqueName: \"kubernetes.io/projected/54de0bca-8a80-49a0-ae9f-0fe13fdeda11-kube-api-access-qt7tk\") pod \"neutron-operator-controller-manager-8d984cc4d-n7mj5\" (UID: \"54de0bca-8a80-49a0-ae9f-0fe13fdeda11\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.311595 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.313131 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.325540 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fph9l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.370774 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt7tk\" (UniqueName: \"kubernetes.io/projected/54de0bca-8a80-49a0-ae9f-0fe13fdeda11-kube-api-access-qt7tk\") pod \"neutron-operator-controller-manager-8d984cc4d-n7mj5\" (UID: \"54de0bca-8a80-49a0-ae9f-0fe13fdeda11\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.371217 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msc7j\" (UniqueName: \"kubernetes.io/projected/0dfb923d-89c5-4fd0-af84-b73494c4cfc2-kube-api-access-msc7j\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-852df\" (UID: \"0dfb923d-89c5-4fd0-af84-b73494c4cfc2\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.371769 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.372450 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.372807 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.384880 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.398486 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.407363 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6ec685-6841-44c5-8315-462e605aa2d0-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.407469 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czvvd\" (UniqueName: \"kubernetes.io/projected/10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95-kube-api-access-czvvd\") pod \"ovn-operator-controller-manager-579449c7d5-j9k2v\" (UID: \"10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.407572 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rpxq\" (UniqueName: \"kubernetes.io/projected/95dcc684-207d-4745-949b-d2bd559b9f18-kube-api-access-9rpxq\") pod \"nova-operator-controller-manager-7c7fc454ff-6h5w4\" (UID: \"95dcc684-207d-4745-949b-d2bd559b9f18\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.407623 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf7z6\" (UniqueName: \"kubernetes.io/projected/cc6ec685-6841-44c5-8315-462e605aa2d0-kube-api-access-pf7z6\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.407649 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6pl\" (UniqueName: \"kubernetes.io/projected/ce2175a4-fac2-4259-91c9-6857fabd2755-kube-api-access-px6pl\") pod \"placement-operator-controller-manager-54689d9f88-x4rkd\" (UID: \"ce2175a4-fac2-4259-91c9-6857fabd2755\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.407677 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmd5q\" (UniqueName: \"kubernetes.io/projected/e30086d8-8211-4ef0-ae80-ec1d79719f51-kube-api-access-xmd5q\") pod \"octavia-operator-controller-manager-7468f855d8-5nhb8\" (UID: \"e30086d8-8211-4ef0-ae80-ec1d79719f51\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.452117 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.456778 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmd5q\" (UniqueName: \"kubernetes.io/projected/e30086d8-8211-4ef0-ae80-ec1d79719f51-kube-api-access-xmd5q\") pod \"octavia-operator-controller-manager-7468f855d8-5nhb8\" (UID: \"e30086d8-8211-4ef0-ae80-ec1d79719f51\") " 
pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.458757 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rpxq\" (UniqueName: \"kubernetes.io/projected/95dcc684-207d-4745-949b-d2bd559b9f18-kube-api-access-9rpxq\") pod \"nova-operator-controller-manager-7c7fc454ff-6h5w4\" (UID: \"95dcc684-207d-4745-949b-d2bd559b9f18\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.460489 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czvvd\" (UniqueName: \"kubernetes.io/projected/10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95-kube-api-access-czvvd\") pod \"ovn-operator-controller-manager-579449c7d5-j9k2v\" (UID: \"10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.479089 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.480259 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.483472 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-s55rg" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.487008 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.499858 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.500957 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.508446 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6ec685-6841-44c5-8315-462e605aa2d0-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.508576 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf7z6\" (UniqueName: \"kubernetes.io/projected/cc6ec685-6841-44c5-8315-462e605aa2d0-kube-api-access-pf7z6\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.508608 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6pl\" (UniqueName: \"kubernetes.io/projected/ce2175a4-fac2-4259-91c9-6857fabd2755-kube-api-access-px6pl\") pod \"placement-operator-controller-manager-54689d9f88-x4rkd\" (UID: \"ce2175a4-fac2-4259-91c9-6857fabd2755\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.508656 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:13 crc kubenswrapper[4610]: E1006 08:57:13.508810 4610 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 06 08:57:13 crc kubenswrapper[4610]: E1006 08:57:13.508868 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert podName:aa003bf3-ca26-468d-975a-5ceaa0361f14 nodeName:}" failed. No retries permitted until 2025-10-06 08:57:14.508848468 +0000 UTC m=+966.223901856 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert") pod "infra-operator-controller-manager-658588b8c9-4hldf" (UID: "aa003bf3-ca26-468d-975a-5ceaa0361f14") : secret "infra-operator-webhook-server-cert" not found Oct 06 08:57:13 crc kubenswrapper[4610]: E1006 08:57:13.509275 4610 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:57:13 crc kubenswrapper[4610]: E1006 08:57:13.509316 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6ec685-6841-44c5-8315-462e605aa2d0-cert podName:cc6ec685-6841-44c5-8315-462e605aa2d0 nodeName:}" failed. No retries permitted until 2025-10-06 08:57:14.00930332 +0000 UTC m=+965.724356708 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6ec685-6841-44c5-8315-462e605aa2d0-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" (UID: "cc6ec685-6841-44c5-8315-462e605aa2d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.524263 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xwr2j" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.527107 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.547965 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.559886 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6pl\" (UniqueName: \"kubernetes.io/projected/ce2175a4-fac2-4259-91c9-6857fabd2755-kube-api-access-px6pl\") pod \"placement-operator-controller-manager-54689d9f88-x4rkd\" (UID: \"ce2175a4-fac2-4259-91c9-6857fabd2755\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.560363 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vzgr7" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.572126 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf7z6\" (UniqueName: \"kubernetes.io/projected/cc6ec685-6841-44c5-8315-462e605aa2d0-kube-api-access-pf7z6\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.598223 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.609944 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x47zt\" (UniqueName: \"kubernetes.io/projected/f40be14e-8191-4b07-8f45-01a5d18ac504-kube-api-access-x47zt\") pod \"telemetry-operator-controller-manager-5d4d74dd89-df2ht\" (UID: \"f40be14e-8191-4b07-8f45-01a5d18ac504\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.610025 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktnll\" (UniqueName: \"kubernetes.io/projected/15cb4fda-d42c-4ce7-a195-8476f589676e-kube-api-access-ktnll\") pod \"test-operator-controller-manager-5cd5cb47d7-sp9cd\" (UID: \"15cb4fda-d42c-4ce7-a195-8476f589676e\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.610089 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5zlp\" (UniqueName: \"kubernetes.io/projected/becf25ed-9d23-4cfa-afe3-7301d5476a7d-kube-api-access-z5zlp\") pod \"swift-operator-controller-manager-6859f9b676-twqvt\" (UID: \"becf25ed-9d23-4cfa-afe3-7301d5476a7d\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.632027 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.634206 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.652834 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.666764 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.683196 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.710132 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.711033 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x47zt\" (UniqueName: \"kubernetes.io/projected/f40be14e-8191-4b07-8f45-01a5d18ac504-kube-api-access-x47zt\") pod \"telemetry-operator-controller-manager-5d4d74dd89-df2ht\" (UID: \"f40be14e-8191-4b07-8f45-01a5d18ac504\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.713645 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktnll\" (UniqueName: \"kubernetes.io/projected/15cb4fda-d42c-4ce7-a195-8476f589676e-kube-api-access-ktnll\") pod \"test-operator-controller-manager-5cd5cb47d7-sp9cd\" (UID: \"15cb4fda-d42c-4ce7-a195-8476f589676e\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.713706 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5zlp\" (UniqueName: \"kubernetes.io/projected/becf25ed-9d23-4cfa-afe3-7301d5476a7d-kube-api-access-z5zlp\") pod \"swift-operator-controller-manager-6859f9b676-twqvt\" (UID: \"becf25ed-9d23-4cfa-afe3-7301d5476a7d\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.714147 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.714248 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.724528 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-95hkq" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.737784 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktnll\" (UniqueName: \"kubernetes.io/projected/15cb4fda-d42c-4ce7-a195-8476f589676e-kube-api-access-ktnll\") pod \"test-operator-controller-manager-5cd5cb47d7-sp9cd\" (UID: \"15cb4fda-d42c-4ce7-a195-8476f589676e\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.744026 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5zlp\" (UniqueName: \"kubernetes.io/projected/becf25ed-9d23-4cfa-afe3-7301d5476a7d-kube-api-access-z5zlp\") pod \"swift-operator-controller-manager-6859f9b676-twqvt\" (UID: \"becf25ed-9d23-4cfa-afe3-7301d5476a7d\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.746590 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x47zt\" (UniqueName: \"kubernetes.io/projected/f40be14e-8191-4b07-8f45-01a5d18ac504-kube-api-access-x47zt\") pod \"telemetry-operator-controller-manager-5d4d74dd89-df2ht\" (UID: \"f40be14e-8191-4b07-8f45-01a5d18ac504\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.757275 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.796431 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.815634 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98rt\" (UniqueName: \"kubernetes.io/projected/2296f857-2cd2-45d3-907c-94e9eb4262ab-kube-api-access-h98rt\") pod \"watcher-operator-controller-manager-6cbc6dd547-9lbhh\" (UID: \"2296f857-2cd2-45d3-907c-94e9eb4262ab\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.821154 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.822900 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.828155 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.828544 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-cczft" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.833154 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.839593 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.853024 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.854342 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.857492 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-d55pp" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.858797 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.886167 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.911011 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb"] Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.914393 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.916584 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-cert\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.916629 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqf4t\" (UniqueName: \"kubernetes.io/projected/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-kube-api-access-kqf4t\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.916661 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98rt\" (UniqueName: \"kubernetes.io/projected/2296f857-2cd2-45d3-907c-94e9eb4262ab-kube-api-access-h98rt\") pod \"watcher-operator-controller-manager-6cbc6dd547-9lbhh\" (UID: \"2296f857-2cd2-45d3-907c-94e9eb4262ab\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.916724 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4lq\" (UniqueName: \"kubernetes.io/projected/ae310e32-abf5-4646-a09d-bbf21cd33dc6-kube-api-access-qp4lq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d\" (UID: \"ae310e32-abf5-4646-a09d-bbf21cd33dc6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" Oct 06 08:57:13 crc kubenswrapper[4610]: I1006 08:57:13.953802 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98rt\" (UniqueName: \"kubernetes.io/projected/2296f857-2cd2-45d3-907c-94e9eb4262ab-kube-api-access-h98rt\") pod \"watcher-operator-controller-manager-6cbc6dd547-9lbhh\" (UID: \"2296f857-2cd2-45d3-907c-94e9eb4262ab\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.019988 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-cert\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.020041 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqf4t\" (UniqueName: \"kubernetes.io/projected/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-kube-api-access-kqf4t\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.020153 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6ec685-6841-44c5-8315-462e605aa2d0-cert\") pod 
\"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.020173 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4lq\" (UniqueName: \"kubernetes.io/projected/ae310e32-abf5-4646-a09d-bbf21cd33dc6-kube-api-access-qp4lq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d\" (UID: \"ae310e32-abf5-4646-a09d-bbf21cd33dc6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" Oct 06 08:57:14 crc kubenswrapper[4610]: E1006 08:57:14.020665 4610 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 06 08:57:14 crc kubenswrapper[4610]: E1006 08:57:14.020710 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-cert podName:533cbdde-bc4c-43b3-a9dd-e72d9b1aba90 nodeName:}" failed. No retries permitted until 2025-10-06 08:57:14.520695623 +0000 UTC m=+966.235749011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-cert") pod "openstack-operator-controller-manager-669d7f654d-zkg2w" (UID: "533cbdde-bc4c-43b3-a9dd-e72d9b1aba90") : secret "webhook-server-cert" not found Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.028318 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6ec685-6841-44c5-8315-462e605aa2d0-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l\" (UID: \"cc6ec685-6841-44c5-8315-462e605aa2d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.040582 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.045849 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqf4t\" (UniqueName: \"kubernetes.io/projected/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-kube-api-access-kqf4t\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.060243 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.061191 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4lq\" (UniqueName: \"kubernetes.io/projected/ae310e32-abf5-4646-a09d-bbf21cd33dc6-kube-api-access-qp4lq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d\" (UID: \"ae310e32-abf5-4646-a09d-bbf21cd33dc6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.105082 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.135105 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" event={"ID":"e8d37aed-cf46-47a0-a8ea-cfec57404966","Type":"ContainerStarted","Data":"ff8838dde54eb602e4c89f4cac267db1144e0bada2ea6a30bac1ef16414a7986"} Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.147136 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" event={"ID":"590d1736-08ea-4b24-9462-51e4f9eb2169","Type":"ContainerStarted","Data":"7cf9a9ab609b34bf891908636fd0ac237f92ccec07930ad403a64837f9216b9a"} Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.147949 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" event={"ID":"23749a1a-8450-4412-850b-1e044d290c69","Type":"ContainerStarted","Data":"c26c18a747ec60bbdc29d70f819b2926753abdb6c4eb34abd76e77ba727b3034"} Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.194431 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.386632 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-575v4"] Oct 06 08:57:14 crc kubenswrapper[4610]: W1006 08:57:14.409151 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2d89eb_7d33_4268_901a_69b008f224d4.slice/crio-0cc7c2c381299aa6b32bebf748f889ccc3c67911e60231145e9b2133402da216 WatchSource:0}: Error finding container 0cc7c2c381299aa6b32bebf748f889ccc3c67911e60231145e9b2133402da216: Status 404 returned error can't find the container with id 0cc7c2c381299aa6b32bebf748f889ccc3c67911e60231145e9b2133402da216 Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.513084 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.532396 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.532944 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-cert\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.533008 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.539677 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/533cbdde-bc4c-43b3-a9dd-e72d9b1aba90-cert\") pod \"openstack-operator-controller-manager-669d7f654d-zkg2w\" (UID: \"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90\") " pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.543841 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa003bf3-ca26-468d-975a-5ceaa0361f14-cert\") pod \"infra-operator-controller-manager-658588b8c9-4hldf\" (UID: \"aa003bf3-ca26-468d-975a-5ceaa0361f14\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:14 crc kubenswrapper[4610]: W1006 08:57:14.544585 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcd4c17_2e7d_4a3f_91e1_e6542cb2e629.slice/crio-4d21530bc9ceddf143180459c93d4fba673b27b87edb08ee32ec72677d2de7c6 WatchSource:0}: Error finding container 4d21530bc9ceddf143180459c93d4fba673b27b87edb08ee32ec72677d2de7c6: Status 404 returned error can't find the container with id 4d21530bc9ceddf143180459c93d4fba673b27b87edb08ee32ec72677d2de7c6 Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.549312 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-z448l"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.600029 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.662560 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.686431 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.688747 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.720032 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.750738 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.950365 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.967024 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.972745 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd"] Oct 06 08:57:14 crc kubenswrapper[4610]: I1006 08:57:14.977331 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4"] Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.033421 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9rpxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7c7fc454ff-6h5w4_openstack-operators(95dcc684-207d-4745-949b-d2bd559b9f18): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.138795 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.148477 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.151384 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.165266 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" event={"ID":"8c2d89eb-7d33-4268-901a-69b008f224d4","Type":"ContainerStarted","Data":"0cc7c2c381299aa6b32bebf748f889ccc3c67911e60231145e9b2133402da216"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.168424 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" event={"ID":"54de0bca-8a80-49a0-ae9f-0fe13fdeda11","Type":"ContainerStarted","Data":"5b959497f95d6b052bfc6172ce3c50e0ecd33ce527f192a2dbe45222a3051d31"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.169800 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" event={"ID":"55d8474b-1188-4617-abe4-d5e45d9a85cb","Type":"ContainerStarted","Data":"557bea4e600bb5797b42aa71e2399f6163805224563671b623e36a91eb767512"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.173742 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.178960 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" event={"ID":"e30086d8-8211-4ef0-ae80-ec1d79719f51","Type":"ContainerStarted","Data":"f116e9e19c16fd9e7e9a9be11f81e161f9002974d2a7d2fefbfc742e2e82495e"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.184177 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" event={"ID":"10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95","Type":"ContainerStarted","Data":"4dd401c6e31d9af3111ab7f57d9f07b5a1c1e57818da4e95419e841a9e9d64df"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 
08:57:15.190664 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.205853 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.216211 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf"] Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.216611 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" event={"ID":"08b5e994-103b-40ba-aef6-4dd36e5c456e","Type":"ContainerStarted","Data":"dc4b9651d1a9df328c6af349461ec51769043d8ad974afd8b9fe44028ea7df8e"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.219551 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" event={"ID":"b63b18e4-4aee-4a86-a5cb-23393a3cfaa3","Type":"ContainerStarted","Data":"9f2fd47fd8c64f0efd51dee87a2d5beddedcce07ffce436ab9ef37c7699abd19"} Oct 06 08:57:15 crc kubenswrapper[4610]: W1006 08:57:15.224322 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecf25ed_9d23_4cfa_afe3_7301d5476a7d.slice/crio-6e2cb2c49c4b8b9486dffa508a083f3d5ed4c80e3c7cc25562a50d2cd3aad5b6 WatchSource:0}: Error finding container 6e2cb2c49c4b8b9486dffa508a083f3d5ed4c80e3c7cc25562a50d2cd3aad5b6: Status 404 returned error can't find the container with id 6e2cb2c49c4b8b9486dffa508a083f3d5ed4c80e3c7cc25562a50d2cd3aad5b6 Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.226721 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" event={"ID":"0dfb923d-89c5-4fd0-af84-b73494c4cfc2","Type":"ContainerStarted","Data":"723a1d2e437cb26c1a367fcce2168b89bd890fababe85dd402f7ae87dba8bd7a"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.229905 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w"] Oct 06 08:57:15 crc kubenswrapper[4610]: W1006 08:57:15.229922 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae310e32_abf5_4646_a09d_bbf21cd33dc6.slice/crio-ab87cf05a5a3281549fb10109dd5aa1d5285a4796445df1d40a582eaa12df8b0 WatchSource:0}: Error finding container ab87cf05a5a3281549fb10109dd5aa1d5285a4796445df1d40a582eaa12df8b0: Status 404 returned error can't find the container with id ab87cf05a5a3281549fb10109dd5aa1d5285a4796445df1d40a582eaa12df8b0 Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.230547 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8kpr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-4hldf_openstack-operators(aa003bf3-ca26-468d-975a-5ceaa0361f14): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.230792 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z5zlp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-twqvt_openstack-operators(becf25ed-9d23-4cfa-afe3-7301d5476a7d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.233434 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qp4lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d_openstack-operators(ae310e32-abf5-4646-a09d-bbf21cd33dc6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.233512 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" event={"ID":"ce2175a4-fac2-4259-91c9-6857fabd2755","Type":"ContainerStarted","Data":"07ef47940c3433dab86345f47877bb2ce99d6a4d3cee5317ae026f46fa156f80"} Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.236309 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" podUID="95dcc684-207d-4745-949b-d2bd559b9f18" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.236480 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" podUID="ae310e32-abf5-4646-a09d-bbf21cd33dc6" Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.251808 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" event={"ID":"8ef0dcc5-529c-4a68-ba57-c68198a73de0","Type":"ContainerStarted","Data":"b5f9f3c7de2c2cf4e242521991774c5ac162aa968711b70d8b4150e9112123cb"} Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.252632 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h98rt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} 
start failed in pod watcher-operator-controller-manager-6cbc6dd547-9lbhh_openstack-operators(2296f857-2cd2-45d3-907c-94e9eb4262ab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.267412 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" event={"ID":"2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629","Type":"ContainerStarted","Data":"4d21530bc9ceddf143180459c93d4fba673b27b87edb08ee32ec72677d2de7c6"} Oct 06 08:57:15 crc kubenswrapper[4610]: I1006 08:57:15.281541 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" event={"ID":"95dcc684-207d-4745-949b-d2bd559b9f18","Type":"ContainerStarted","Data":"71328ec1875c31f884f6f7c923c72a348af5920926b1b644ddb3f60a1c276a42"} Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.282132 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x47zt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-df2ht_openstack-operators(f40be14e-8191-4b07-8f45-01a5d18ac504): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.290261 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" podUID="95dcc684-207d-4745-949b-d2bd559b9f18" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.609850 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" podUID="2296f857-2cd2-45d3-907c-94e9eb4262ab" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.676759 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" podUID="becf25ed-9d23-4cfa-afe3-7301d5476a7d" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.685354 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" podUID="aa003bf3-ca26-468d-975a-5ceaa0361f14" Oct 06 08:57:15 crc kubenswrapper[4610]: E1006 08:57:15.903861 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" podUID="f40be14e-8191-4b07-8f45-01a5d18ac504" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.305258 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" event={"ID":"2296f857-2cd2-45d3-907c-94e9eb4262ab","Type":"ContainerStarted","Data":"f7f8b1e36be6a2080a6103788e475f5f008309afa761bfb545fb9a3e67a740bb"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.305307 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" event={"ID":"2296f857-2cd2-45d3-907c-94e9eb4262ab","Type":"ContainerStarted","Data":"5a6f165673470af988ee388f8f0523f7e45a62a7ae9c60e56d0c7c363b6aca90"} Oct 06 08:57:16 crc kubenswrapper[4610]: E1006 08:57:16.309007 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" podUID="2296f857-2cd2-45d3-907c-94e9eb4262ab" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.315908 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" event={"ID":"95dcc684-207d-4745-949b-d2bd559b9f18","Type":"ContainerStarted","Data":"4dd93c6bf9a18aeac8d9204ccf0ddb6a4aa9708ad9e4989b7c347c076b49c59c"} Oct 06 08:57:16 crc kubenswrapper[4610]: E1006 08:57:16.319828 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" 
pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" podUID="95dcc684-207d-4745-949b-d2bd559b9f18" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.333018 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" event={"ID":"f40be14e-8191-4b07-8f45-01a5d18ac504","Type":"ContainerStarted","Data":"0f2af157d659b5d0453cb126bc8f431cf899d334224ed8d26b69358eec4945a1"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.333077 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" event={"ID":"f40be14e-8191-4b07-8f45-01a5d18ac504","Type":"ContainerStarted","Data":"44c89e5e476077068ed88025e9d82c2a62921ec0860bc5c7f77c62353b41b9d7"} Oct 06 08:57:16 crc kubenswrapper[4610]: E1006 08:57:16.334743 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" podUID="f40be14e-8191-4b07-8f45-01a5d18ac504" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.360082 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" event={"ID":"aa003bf3-ca26-468d-975a-5ceaa0361f14","Type":"ContainerStarted","Data":"9916a90de27535345051f698a4cad741a9cc38c3134a469e0299699c3b354b94"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.360141 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" event={"ID":"aa003bf3-ca26-468d-975a-5ceaa0361f14","Type":"ContainerStarted","Data":"b259b8566eb79cce97299d85797a09795fd914a8a5fbf11ba4f2570f955595e6"} Oct 06 08:57:16 crc kubenswrapper[4610]: E1006 08:57:16.365927 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" podUID="aa003bf3-ca26-468d-975a-5ceaa0361f14" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.368547 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" event={"ID":"cc6ec685-6841-44c5-8315-462e605aa2d0","Type":"ContainerStarted","Data":"e172d805a011f630c804d1108a1e9e0431658f1b2edf0996cd062f76b76f017a"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.372486 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" event={"ID":"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90","Type":"ContainerStarted","Data":"e4e685f2f8dceb2ee95e445b7cd06c020d1787db30b23317a45ee67321c85ec2"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.372516 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" event={"ID":"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90","Type":"ContainerStarted","Data":"852abf731420c170dc1cde673b1e271459f59318241a9c858b76cd6e93463cfb"} Oct 06 08:57:16 crc 
kubenswrapper[4610]: I1006 08:57:16.372526 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" event={"ID":"533cbdde-bc4c-43b3-a9dd-e72d9b1aba90","Type":"ContainerStarted","Data":"ea3018c6a59ea463452dba21c783c8e6feabce1072ae29695d4289b275ded205"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.373103 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.415388 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" event={"ID":"becf25ed-9d23-4cfa-afe3-7301d5476a7d","Type":"ContainerStarted","Data":"43208778856cf090a054d402644211722faf2d0b946b5baddc4b1db55cb995f3"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.415441 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" event={"ID":"becf25ed-9d23-4cfa-afe3-7301d5476a7d","Type":"ContainerStarted","Data":"6e2cb2c49c4b8b9486dffa508a083f3d5ed4c80e3c7cc25562a50d2cd3aad5b6"} Oct 06 08:57:16 crc kubenswrapper[4610]: E1006 08:57:16.419773 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" podUID="becf25ed-9d23-4cfa-afe3-7301d5476a7d" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.422525 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" event={"ID":"ae310e32-abf5-4646-a09d-bbf21cd33dc6","Type":"ContainerStarted","Data":"ab87cf05a5a3281549fb10109dd5aa1d5285a4796445df1d40a582eaa12df8b0"} Oct 06 08:57:16 crc kubenswrapper[4610]: E1006 08:57:16.424753 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" podUID="ae310e32-abf5-4646-a09d-bbf21cd33dc6" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.425921 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" event={"ID":"15cb4fda-d42c-4ce7-a195-8476f589676e","Type":"ContainerStarted","Data":"f3a5f7e8ae86b7184f121caa001c5360d093282e6b2b09535201a35846cc458c"} Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.469340 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.469406 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.469464 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.470159 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bdba77b46e82044baaa28f03a702e74591a001a85966cb8cf3dd9e4ff7e62b2"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.470226 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://7bdba77b46e82044baaa28f03a702e74591a001a85966cb8cf3dd9e4ff7e62b2" gracePeriod=600 Oct 06 08:57:16 crc kubenswrapper[4610]: I1006 08:57:16.562525 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" podStartSLOduration=3.5625022299999998 podStartE2EDuration="3.56250223s" podCreationTimestamp="2025-10-06 08:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:57:16.476178047 +0000 UTC m=+968.191231445" watchObservedRunningTime="2025-10-06 08:57:16.56250223 +0000 UTC m=+968.277555618" Oct 06 08:57:17 crc kubenswrapper[4610]: I1006 08:57:17.436152 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="7bdba77b46e82044baaa28f03a702e74591a001a85966cb8cf3dd9e4ff7e62b2" exitCode=0 Oct 06 08:57:17 crc kubenswrapper[4610]: I1006 08:57:17.436590 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"7bdba77b46e82044baaa28f03a702e74591a001a85966cb8cf3dd9e4ff7e62b2"} Oct 06 08:57:17 crc kubenswrapper[4610]: E1006 08:57:17.440087 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" podUID="f40be14e-8191-4b07-8f45-01a5d18ac504" Oct 06 08:57:17 crc kubenswrapper[4610]: E1006 08:57:17.440900 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" podUID="95dcc684-207d-4745-949b-d2bd559b9f18" Oct 06 08:57:17 crc kubenswrapper[4610]: E1006 08:57:17.442248 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" podUID="2296f857-2cd2-45d3-907c-94e9eb4262ab" Oct 06 08:57:17 crc kubenswrapper[4610]: I1006 08:57:17.448713 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"2a03a6c0215984950d574d138749aa7d53fb617a66262307cd832997f9be78d9"} Oct 06 08:57:17 crc kubenswrapper[4610]: I1006 08:57:17.448861 4610 scope.go:117] "RemoveContainer" containerID="38f8706bf8b9b80033ad9a39fb7a4758655b1b4513afef87a5de3c844e2a88e6" Oct 06 08:57:17 crc kubenswrapper[4610]: E1006 08:57:17.464054 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" podUID="aa003bf3-ca26-468d-975a-5ceaa0361f14" Oct 06 08:57:17 crc kubenswrapper[4610]: E1006 08:57:17.464112 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" podUID="becf25ed-9d23-4cfa-afe3-7301d5476a7d" Oct 06 08:57:17 crc kubenswrapper[4610]: E1006 08:57:17.464154 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" podUID="ae310e32-abf5-4646-a09d-bbf21cd33dc6" Oct 06 08:57:24 crc kubenswrapper[4610]: I1006 08:57:24.697780 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-669d7f654d-zkg2w" Oct 06 08:57:27 crc kubenswrapper[4610]: E1006 08:57:27.437208 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f9fc3cf4084a325d7f5f9773bfcc2b839ccff1c72e61fdd8f410a7ef46497f75" Oct 06 08:57:27 crc kubenswrapper[4610]: E1006 08:57:27.438224 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f9fc3cf4084a325d7f5f9773bfcc2b839ccff1c72e61fdd8f410a7ef46497f75,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqtnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-5f7c849b98-gcbb8_openstack-operators(e8d37aed-cf46-47a0-a8ea-cfec57404966): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:57:29 crc kubenswrapper[4610]: E1006 08:57:29.119242 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254" Oct 06 08:57:29 crc kubenswrapper[4610]: E1006 08:57:29.119872 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-czvvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-579449c7d5-j9k2v_openstack-operators(10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:57:29 crc kubenswrapper[4610]: E1006 08:57:29.588069 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610" Oct 06 08:57:29 crc kubenswrapper[4610]: E1006 08:57:29.588253 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2rp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5568b5d68-z448l_openstack-operators(55d8474b-1188-4617-abe4-d5e45d9a85cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:57:29 crc kubenswrapper[4610]: E1006 08:57:29.946832 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1" Oct 06 08:57:29 crc kubenswrapper[4610]: E1006 08:57:29.947072 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-px6pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-x4rkd_openstack-operators(ce2175a4-fac2-4259-91c9-6857fabd2755): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:57:30 crc kubenswrapper[4610]: E1006 08:57:30.373000 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182" Oct 06 08:57:30 crc kubenswrapper[4610]: E1006 08:57:30.373246 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xmd5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7468f855d8-5nhb8_openstack-operators(e30086d8-8211-4ef0-ae80-ec1d79719f51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:57:30 crc kubenswrapper[4610]: E1006 08:57:30.704115 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" podUID="55d8474b-1188-4617-abe4-d5e45d9a85cb" Oct 06 08:57:30 crc kubenswrapper[4610]: E1006 08:57:30.734576 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" podUID="10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95" Oct 06 08:57:30 crc kubenswrapper[4610]: E1006 08:57:30.758458 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" podUID="ce2175a4-fac2-4259-91c9-6857fabd2755" Oct 06 08:57:30 crc kubenswrapper[4610]: E1006 08:57:30.837444 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" podUID="e8d37aed-cf46-47a0-a8ea-cfec57404966" Oct 06 08:57:31 crc kubenswrapper[4610]: E1006 08:57:31.035488 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" podUID="e30086d8-8211-4ef0-ae80-ec1d79719f51" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.569851 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" event={"ID":"b63b18e4-4aee-4a86-a5cb-23393a3cfaa3","Type":"ContainerStarted","Data":"091bdc2da9746aab15689b0be280c4ff4518504353106b871927d7ede2f5592e"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.589300 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" 
event={"ID":"10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95","Type":"ContainerStarted","Data":"186345a9f9b0cbfa52cd835d74dcaadaeff8db925009600d0e8d7f3060eac68b"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.598650 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" event={"ID":"ce2175a4-fac2-4259-91c9-6857fabd2755","Type":"ContainerStarted","Data":"80190c5ea21f41bf46489841f1f08f89d69a592a814aabd8f20191753f5952fd"} Oct 06 08:57:31 crc kubenswrapper[4610]: E1006 08:57:31.601711 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" podUID="ce2175a4-fac2-4259-91c9-6857fabd2755" Oct 06 08:57:31 crc kubenswrapper[4610]: E1006 08:57:31.601826 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" podUID="10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.621672 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" event={"ID":"e30086d8-8211-4ef0-ae80-ec1d79719f51","Type":"ContainerStarted","Data":"2aae2473a9673907d201eb7f5f97ca27f52ec404bb0f43f513d3bdfbbe2990c3"} Oct 06 08:57:31 crc kubenswrapper[4610]: E1006 08:57:31.623110 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" podUID="e30086d8-8211-4ef0-ae80-ec1d79719f51" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.646215 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" event={"ID":"e8d37aed-cf46-47a0-a8ea-cfec57404966","Type":"ContainerStarted","Data":"fb4b06ccca0fee8e46458cdadf296b4f9d7785d6fd9792a3e196b369050d325d"} Oct 06 08:57:31 crc kubenswrapper[4610]: E1006 08:57:31.653755 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f9fc3cf4084a325d7f5f9773bfcc2b839ccff1c72e61fdd8f410a7ef46497f75\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" podUID="e8d37aed-cf46-47a0-a8ea-cfec57404966" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.681676 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" event={"ID":"54de0bca-8a80-49a0-ae9f-0fe13fdeda11","Type":"ContainerStarted","Data":"b9f5d6975dc6abe74f7c6f240ebb1d9215df95d8e8404cd97a8cc5dbb44990d3"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.691684 4610 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" event={"ID":"23749a1a-8450-4412-850b-1e044d290c69","Type":"ContainerStarted","Data":"9dc81150bf8a8c8c97f9d6d3f292f18e8a858de4c2455d030597b495e51379e9"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.709313 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" event={"ID":"590d1736-08ea-4b24-9462-51e4f9eb2169","Type":"ContainerStarted","Data":"98072bb3786f7ed16bfaf350fe971c86d4a131627b9e07456b8e0a0e4941b9e0"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.741607 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" event={"ID":"8ef0dcc5-529c-4a68-ba57-c68198a73de0","Type":"ContainerStarted","Data":"000de6ee91ed68e239e49b3d8b6a87d7b336deb1d12915c9548f39375b6099bf"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.744457 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" event={"ID":"55d8474b-1188-4617-abe4-d5e45d9a85cb","Type":"ContainerStarted","Data":"28c695d7d24d6ce56fad9a46235535e0c3759bae3c93cd7d3693e89cbb865433"} Oct 06 08:57:31 crc kubenswrapper[4610]: E1006 08:57:31.746873 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" podUID="55d8474b-1188-4617-abe4-d5e45d9a85cb" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.759173 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" event={"ID":"2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629","Type":"ContainerStarted","Data":"bd8f8183339c7bf1adf27fa2b5bab463134c5862e24ee4a9d9ff74f5337dcb6b"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.786308 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" event={"ID":"08b5e994-103b-40ba-aef6-4dd36e5c456e","Type":"ContainerStarted","Data":"4f63496b661faaaca8bd811d1f47529aa37cc05b3234bdcb5ba25d994e576cf1"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.786379 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" event={"ID":"08b5e994-103b-40ba-aef6-4dd36e5c456e","Type":"ContainerStarted","Data":"6ddbf92381eb1ffa15ee333e125e99948f005fb5872f534a237c225820d38359"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.792491 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.792785 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" event={"ID":"cc6ec685-6841-44c5-8315-462e605aa2d0","Type":"ContainerStarted","Data":"1061d29c7999bbc8ca0d6a64974c0636744d384628654e419774f8f28a649168"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.793372 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.800900 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" event={"ID":"8c2d89eb-7d33-4268-901a-69b008f224d4","Type":"ContainerStarted","Data":"444f2a5bb81df0fd9dbf9e2e34d50ce8bfa79be5e37e5ff8b921e24b550a1f7f"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.801522 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.803316 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" event={"ID":"15cb4fda-d42c-4ce7-a195-8476f589676e","Type":"ContainerStarted","Data":"bb5fd880782af7fd6a2dffe8dbbfd8af6c60ffc6572411e4907a4348ea6d5a70"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.803339 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" event={"ID":"15cb4fda-d42c-4ce7-a195-8476f589676e","Type":"ContainerStarted","Data":"6262e719bdea0a972028b3f1ba1200b37216b83a13da91f7197859d40cc82682"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.803700 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.816147 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" event={"ID":"0dfb923d-89c5-4fd0-af84-b73494c4cfc2","Type":"ContainerStarted","Data":"31a3b7c7e3d08f1805649ae54b0e3e7d9d037fe73df84bd8a0f045079629f450"} Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.905815 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" podStartSLOduration=4.177170505 podStartE2EDuration="19.90579826s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.670753461 +0000 UTC m=+966.385806849" lastFinishedPulling="2025-10-06 08:57:30.399381216 +0000 UTC m=+982.114434604" observedRunningTime="2025-10-06 08:57:31.899772309 +0000 UTC m=+983.614825697" watchObservedRunningTime="2025-10-06 08:57:31.90579826 +0000 UTC m=+983.620851648" Oct 06 08:57:31 crc kubenswrapper[4610]: I1006 08:57:31.991065 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" podStartSLOduration=4.7621971720000005 podStartE2EDuration="19.991036104s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.172790874 +0000 UTC m=+966.887844262" lastFinishedPulling="2025-10-06 08:57:30.401629806 +0000 UTC m=+982.116683194" observedRunningTime="2025-10-06 08:57:31.9815556 +0000 UTC m=+983.696609018" watchObservedRunningTime="2025-10-06 08:57:31.991036104 +0000 UTC m=+983.706089502" Oct 06 08:57:32 crc kubenswrapper[4610]: I1006 08:57:32.017889 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" podStartSLOduration=4.016307856 podStartE2EDuration="20.017865403s" podCreationTimestamp="2025-10-06 08:57:12 +0000 
UTC" firstStartedPulling="2025-10-06 08:57:14.423515107 +0000 UTC m=+966.138568495" lastFinishedPulling="2025-10-06 08:57:30.425072664 +0000 UTC m=+982.140126042" observedRunningTime="2025-10-06 08:57:32.009189611 +0000 UTC m=+983.724243009" watchObservedRunningTime="2025-10-06 08:57:32.017865403 +0000 UTC m=+983.732918801" Oct 06 08:57:32 crc kubenswrapper[4610]: I1006 08:57:32.041163 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" podStartSLOduration=3.874047628 podStartE2EDuration="19.041145417s" podCreationTimestamp="2025-10-06 08:57:13 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.255821778 +0000 UTC m=+966.970875166" lastFinishedPulling="2025-10-06 08:57:30.422919567 +0000 UTC m=+982.137972955" observedRunningTime="2025-10-06 08:57:32.035688391 +0000 UTC m=+983.750741779" watchObservedRunningTime="2025-10-06 08:57:32.041145417 +0000 UTC m=+983.756198805" Oct 06 08:57:32 crc kubenswrapper[4610]: I1006 08:57:32.828561 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" event={"ID":"cc6ec685-6841-44c5-8315-462e605aa2d0","Type":"ContainerStarted","Data":"5c9fff7de6dcbd4cdd58d29acfb32b94eb779ca56ed6d6b6d320fc7593b70437"} Oct 06 08:57:32 crc kubenswrapper[4610]: I1006 08:57:32.831358 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" event={"ID":"8c2d89eb-7d33-4268-901a-69b008f224d4","Type":"ContainerStarted","Data":"6490f4f10569ecf502416cc83389781d57a6114d6c2e68e4484f5b7f4296f61d"} Oct 06 08:57:32 crc kubenswrapper[4610]: I1006 08:57:32.833997 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" event={"ID":"590d1736-08ea-4b24-9462-51e4f9eb2169","Type":"ContainerStarted","Data":"969892a31d965d0d512722ffe8c80026a70705437fbd98914fb477160286662c"} Oct 06 08:57:32 crc kubenswrapper[4610]: E1006 08:57:32.835394 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" podUID="ce2175a4-fac2-4259-91c9-6857fabd2755" Oct 06 08:57:32 crc kubenswrapper[4610]: E1006 08:57:32.837946 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f923b76c1dd8fde02a5faf8a0a433cfacfb7b743f371de64a12e30d6efcde254\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" podUID="10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95" Oct 06 08:57:32 crc kubenswrapper[4610]: E1006 08:57:32.838195 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f9fc3cf4084a325d7f5f9773bfcc2b839ccff1c72e61fdd8f410a7ef46497f75\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" podUID="e8d37aed-cf46-47a0-a8ea-cfec57404966" Oct 06 08:57:32 crc kubenswrapper[4610]: E1006 08:57:32.838362 4610 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" podUID="e30086d8-8211-4ef0-ae80-ec1d79719f51" Oct 06 08:57:32 crc kubenswrapper[4610]: E1006 08:57:32.838421 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" podUID="55d8474b-1188-4617-abe4-d5e45d9a85cb" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.841953 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" event={"ID":"2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629","Type":"ContainerStarted","Data":"e830ac6c6d20fdf9164127c6e4425cd96d71e5fb769162d8c4fa0a640a29b736"} Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.842194 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.844217 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" event={"ID":"b63b18e4-4aee-4a86-a5cb-23393a3cfaa3","Type":"ContainerStarted","Data":"636bb1b87d8a6a25f9bcba395ee2c4e7292847bbc4ba38d9aac3c98c034ff59f"} Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.844847 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.847144 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" event={"ID":"54de0bca-8a80-49a0-ae9f-0fe13fdeda11","Type":"ContainerStarted","Data":"b68eea49de36e1086f7b7778aa3e3f1954f9303fca9736c1eafe3215ffe88689"} Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.847515 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.850095 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" event={"ID":"23749a1a-8450-4412-850b-1e044d290c69","Type":"ContainerStarted","Data":"14d8dd1bf836897ecc22922bccdcec47bbad946e6d45b02b30a249c68322b157"} Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.850508 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.854280 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" event={"ID":"0dfb923d-89c5-4fd0-af84-b73494c4cfc2","Type":"ContainerStarted","Data":"ec136766021e3405d5e1695ddb7239a35f11822ba469d6fd15b582ca351ddc8c"} Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.854993 4610 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.869170 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" event={"ID":"8ef0dcc5-529c-4a68-ba57-c68198a73de0","Type":"ContainerStarted","Data":"47f64a56fe544bc39083ff52ba31f50618eca11a36af73cd4882c611e254f8f1"} Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.869554 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.869669 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.872479 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" podStartSLOduration=6.002732771 podStartE2EDuration="21.872456956s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.570224168 +0000 UTC m=+966.285277556" lastFinishedPulling="2025-10-06 08:57:30.439948353 +0000 UTC m=+982.155001741" observedRunningTime="2025-10-06 08:57:33.865397727 +0000 UTC m=+985.580451125" watchObservedRunningTime="2025-10-06 08:57:33.872456956 +0000 UTC m=+985.587510344" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.901733 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" podStartSLOduration=5.631302199 podStartE2EDuration="21.90170489s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.133584318 +0000 UTC m=+965.848637706" lastFinishedPulling="2025-10-06 08:57:30.403987009 +0000 UTC m=+982.119040397" observedRunningTime="2025-10-06 08:57:33.886122873 +0000 UTC m=+985.601176261" watchObservedRunningTime="2025-10-06 08:57:33.90170489 +0000 UTC m=+985.616758288" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.913704 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" podStartSLOduration=6.224860433 podStartE2EDuration="21.913679761s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.723101474 +0000 UTC m=+966.438154862" lastFinishedPulling="2025-10-06 08:57:30.411920802 +0000 UTC m=+982.126974190" observedRunningTime="2025-10-06 08:57:33.905195804 +0000 UTC m=+985.620249212" watchObservedRunningTime="2025-10-06 08:57:33.913679761 +0000 UTC m=+985.628733149" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.928091 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" podStartSLOduration=6.007734356 podStartE2EDuration="21.928070197s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.518872022 +0000 UTC m=+966.233925410" lastFinishedPulling="2025-10-06 08:57:30.439207863 +0000 UTC m=+982.154261251" observedRunningTime="2025-10-06 08:57:33.921887771 +0000 UTC m=+985.636941159" watchObservedRunningTime="2025-10-06 08:57:33.928070197 +0000 UTC m=+985.643123575" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 
08:57:33.941498 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" podStartSLOduration=6.281720027 podStartE2EDuration="21.941483856s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.765260614 +0000 UTC m=+966.480314002" lastFinishedPulling="2025-10-06 08:57:30.425024453 +0000 UTC m=+982.140077831" observedRunningTime="2025-10-06 08:57:33.938902427 +0000 UTC m=+985.653955805" watchObservedRunningTime="2025-10-06 08:57:33.941483856 +0000 UTC m=+985.656537234" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.965999 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" podStartSLOduration=5.594658386 podStartE2EDuration="21.965977142s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.055455544 +0000 UTC m=+965.770508932" lastFinishedPulling="2025-10-06 08:57:30.4267743 +0000 UTC m=+982.141827688" observedRunningTime="2025-10-06 08:57:33.955607374 +0000 UTC m=+985.670660762" watchObservedRunningTime="2025-10-06 08:57:33.965977142 +0000 UTC m=+985.681030530" Oct 06 08:57:33 crc kubenswrapper[4610]: I1006 08:57:33.970542 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" podStartSLOduration=6.528011966 podStartE2EDuration="21.970524364s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.980477591 +0000 UTC m=+966.695530979" lastFinishedPulling="2025-10-06 08:57:30.422989989 +0000 UTC m=+982.138043377" observedRunningTime="2025-10-06 08:57:33.969396844 +0000 UTC m=+985.684450242" watchObservedRunningTime="2025-10-06 08:57:33.970524364 +0000 UTC m=+985.685577752" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.905419 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" event={"ID":"95dcc684-207d-4745-949b-d2bd559b9f18","Type":"ContainerStarted","Data":"5e598acbdeed703a90619c2d4f49e003377b953f1b1c729df4a4cbb42aec97cc"} Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.905898 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.907213 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" event={"ID":"ae310e32-abf5-4646-a09d-bbf21cd33dc6","Type":"ContainerStarted","Data":"5622722770c40068b9906e83a02477ecd6f977b822bc8501db8d3d7302452b2d"} Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.908957 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" event={"ID":"f40be14e-8191-4b07-8f45-01a5d18ac504","Type":"ContainerStarted","Data":"a210f778b6ec61b23415f7f46e9213cf4f995a8c1a438be0e74bc4c2735ca071"} Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.909339 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.911305 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" event={"ID":"aa003bf3-ca26-468d-975a-5ceaa0361f14","Type":"ContainerStarted","Data":"6c156c35191e0eb4056ba2293f1ff1be6191d0fefd31696278220a488190ebfe"} Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.911636 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.918250 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" event={"ID":"2296f857-2cd2-45d3-907c-94e9eb4262ab","Type":"ContainerStarted","Data":"e08f4a5917749d2fd525006a0dc081b53a3860ec46456174dc006143fcc0dad7"} Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.918686 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.927355 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" event={"ID":"becf25ed-9d23-4cfa-afe3-7301d5476a7d","Type":"ContainerStarted","Data":"39131af6dab0156390cfd18d8249c7170c354b1a3e59d7b328c5cd8850ac12ff"} Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.927848 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.932938 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" podStartSLOduration=4.10345424 podStartE2EDuration="25.932924426s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.033286306 +0000 UTC m=+966.748339684" lastFinishedPulling="2025-10-06 08:57:36.862756462 +0000 UTC m=+988.577809870" observedRunningTime="2025-10-06 08:57:37.932445713 +0000 UTC m=+989.647499111" watchObservedRunningTime="2025-10-06 08:57:37.932924426 +0000 UTC m=+989.647977824" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.951612 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" podStartSLOduration=3.227959176 podStartE2EDuration="24.951589236s" podCreationTimestamp="2025-10-06 08:57:13 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.230676385 +0000 UTC m=+966.945729773" lastFinishedPulling="2025-10-06 08:57:36.954306445 +0000 UTC m=+988.669359833" observedRunningTime="2025-10-06 08:57:37.951416652 +0000 UTC m=+989.666470060" watchObservedRunningTime="2025-10-06 08:57:37.951589236 +0000 UTC m=+989.666642634" Oct 06 08:57:37 crc kubenswrapper[4610]: I1006 08:57:37.972265 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d" podStartSLOduration=3.344421076 podStartE2EDuration="24.97224508s" podCreationTimestamp="2025-10-06 08:57:13 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.233356926 +0000 UTC m=+966.948410304" lastFinishedPulling="2025-10-06 08:57:36.86118092 +0000 UTC m=+988.576234308" observedRunningTime="2025-10-06 08:57:37.964709868 +0000 UTC m=+989.679763266" watchObservedRunningTime="2025-10-06 08:57:37.97224508 +0000 UTC m=+989.687298478" Oct 06 08:57:37 crc kubenswrapper[4610]: 
I1006 08:57:37.988255 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" podStartSLOduration=3.408958745 podStartE2EDuration="24.988236828s" podCreationTimestamp="2025-10-06 08:57:13 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.2820087 +0000 UTC m=+966.997062088" lastFinishedPulling="2025-10-06 08:57:36.861286783 +0000 UTC m=+988.576340171" observedRunningTime="2025-10-06 08:57:37.983036399 +0000 UTC m=+989.698089817" watchObservedRunningTime="2025-10-06 08:57:37.988236828 +0000 UTC m=+989.703290226" Oct 06 08:57:38 crc kubenswrapper[4610]: I1006 08:57:38.007572 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" podStartSLOduration=4.376645861 podStartE2EDuration="26.007553426s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.230447809 +0000 UTC m=+966.945501197" lastFinishedPulling="2025-10-06 08:57:36.861355384 +0000 UTC m=+988.576408762" observedRunningTime="2025-10-06 08:57:38.004847523 +0000 UTC m=+989.719900911" watchObservedRunningTime="2025-10-06 08:57:38.007553426 +0000 UTC m=+989.722606824" Oct 06 08:57:38 crc kubenswrapper[4610]: I1006 08:57:38.025640 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" podStartSLOduration=3.396810191 podStartE2EDuration="25.02562143s" podCreationTimestamp="2025-10-06 08:57:13 +0000 UTC" firstStartedPulling="2025-10-06 08:57:15.25250376 +0000 UTC m=+966.967557148" lastFinishedPulling="2025-10-06 08:57:36.881314979 +0000 UTC m=+988.596368387" observedRunningTime="2025-10-06 08:57:38.021403987 +0000 UTC m=+989.736457375" watchObservedRunningTime="2025-10-06 08:57:38.02562143 +0000 UTC m=+989.740674818" Oct 06 08:57:42 crc kubenswrapper[4610]: I1006 08:57:42.882773 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-65ffb" Oct 06 08:57:42 crc kubenswrapper[4610]: I1006 08:57:42.894882 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-f6dh9" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.282535 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-575v4" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.315482 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-7sghv" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.376448 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-6dqrl" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.376856 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-cqqbc" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.377924 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-xfwfl" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.406159 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-n7mj5" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.670733 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-852df" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.761623 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6h5w4" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.799100 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-twqvt" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.844655 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-df2ht" Oct 06 08:57:43 crc kubenswrapper[4610]: I1006 08:57:43.919680 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-sp9cd" Oct 06 08:57:44 crc kubenswrapper[4610]: I1006 08:57:44.063743 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-9lbhh" Oct 06 08:57:44 crc kubenswrapper[4610]: I1006 08:57:44.204289 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l" Oct 06 08:57:44 crc kubenswrapper[4610]: I1006 08:57:44.612876 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-4hldf" Oct 06 08:57:47 crc kubenswrapper[4610]: I1006 08:57:47.073272 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.012759 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" event={"ID":"55d8474b-1188-4617-abe4-d5e45d9a85cb","Type":"ContainerStarted","Data":"661a9b3c9526fb0dedc68621de34b7bfe73b4a0f7de4d965d0949135d8300225"} Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.013204 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.014945 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" event={"ID":"e30086d8-8211-4ef0-ae80-ec1d79719f51","Type":"ContainerStarted","Data":"1237857309e2766e1fc5053587e79f0da3b2f1b82b1ecd34ecb4b4b6bac2e1fc"} Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.015182 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.016664 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" event={"ID":"10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95","Type":"ContainerStarted","Data":"8d8f17508f28a85bd9cbd97b5b9dbcbf2931b5e3d7e968f58fe48523063c95a6"} Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.016886 4610 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.028803 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" podStartSLOduration=2.833260757 podStartE2EDuration="36.028787064s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.553803138 +0000 UTC m=+966.268856526" lastFinishedPulling="2025-10-06 08:57:47.749329445 +0000 UTC m=+999.464382833" observedRunningTime="2025-10-06 08:57:48.027392116 +0000 UTC m=+999.742445524" watchObservedRunningTime="2025-10-06 08:57:48.028787064 +0000 UTC m=+999.743840452" Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.042224 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" podStartSLOduration=3.490564828 podStartE2EDuration="36.042206363s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.986576194 +0000 UTC m=+966.701629582" lastFinishedPulling="2025-10-06 08:57:47.538217719 +0000 UTC m=+999.253271117" observedRunningTime="2025-10-06 08:57:48.03984519 +0000 UTC m=+999.754898578" watchObservedRunningTime="2025-10-06 08:57:48.042206363 +0000 UTC m=+999.757259771" Oct 06 08:57:48 crc kubenswrapper[4610]: I1006 08:57:48.055392 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" podStartSLOduration=3.05659088 podStartE2EDuration="36.055375556s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.708091462 +0000 UTC m=+966.423144850" lastFinishedPulling="2025-10-06 08:57:47.706876138 +0000 UTC m=+999.421929526" observedRunningTime="2025-10-06 08:57:48.053715851 +0000 UTC m=+999.768769239" watchObservedRunningTime="2025-10-06 08:57:48.055375556 +0000 UTC m=+999.770428954" Oct 06 08:57:49 crc kubenswrapper[4610]: I1006 08:57:49.025453 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" event={"ID":"e8d37aed-cf46-47a0-a8ea-cfec57404966","Type":"ContainerStarted","Data":"51692a5a6642f588b6c4e96844cc18c1db1bb3774090a016c620b679ea119ce5"} Oct 06 08:57:49 crc kubenswrapper[4610]: I1006 08:57:49.026073 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:57:49 crc kubenswrapper[4610]: I1006 08:57:49.027525 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" event={"ID":"ce2175a4-fac2-4259-91c9-6857fabd2755","Type":"ContainerStarted","Data":"94b82600f629967bbbd6c8457079d9e58a3fa59f2b57690f70f065477cda7f80"} Oct 06 08:57:49 crc kubenswrapper[4610]: I1006 08:57:49.044003 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" podStartSLOduration=2.272704026 podStartE2EDuration="37.043985045s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.052779273 +0000 UTC m=+965.767832661" lastFinishedPulling="2025-10-06 08:57:48.824060292 +0000 UTC m=+1000.539113680" observedRunningTime="2025-10-06 08:57:49.038821987 +0000 UTC m=+1000.753875375" 
watchObservedRunningTime="2025-10-06 08:57:49.043985045 +0000 UTC m=+1000.759038443" Oct 06 08:57:49 crc kubenswrapper[4610]: I1006 08:57:49.056086 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" podStartSLOduration=3.477709824 podStartE2EDuration="37.056068739s" podCreationTimestamp="2025-10-06 08:57:12 +0000 UTC" firstStartedPulling="2025-10-06 08:57:14.990716985 +0000 UTC m=+966.705770373" lastFinishedPulling="2025-10-06 08:57:48.5690759 +0000 UTC m=+1000.284129288" observedRunningTime="2025-10-06 08:57:49.052198545 +0000 UTC m=+1000.767251953" watchObservedRunningTime="2025-10-06 08:57:49.056068739 +0000 UTC m=+1000.771122127" Oct 06 08:57:53 crc kubenswrapper[4610]: I1006 08:57:53.236188 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-z448l" Oct 06 08:57:53 crc kubenswrapper[4610]: I1006 08:57:53.490788 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-5nhb8" Oct 06 08:57:53 crc kubenswrapper[4610]: I1006 08:57:53.601697 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-j9k2v" Oct 06 08:57:53 crc kubenswrapper[4610]: I1006 08:57:53.661021 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:57:53 crc kubenswrapper[4610]: I1006 08:57:53.665955 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-x4rkd" Oct 06 08:58:02 crc kubenswrapper[4610]: I1006 08:58:02.866290 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-gcbb8" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.003464 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g4b22"] Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.006225 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.009311 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.009603 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.010654 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bt4gh" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.010899 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.016762 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g4b22"] Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.101216 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7czwz"] Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.102675 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.110223 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.122982 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7czwz"] Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.168417 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10da10cd-5b38-421d-9965-0c599b6af564-config\") pod \"dnsmasq-dns-675f4bcbfc-g4b22\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.168722 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zff\" (UniqueName: \"kubernetes.io/projected/10da10cd-5b38-421d-9965-0c599b6af564-kube-api-access-g8zff\") pod \"dnsmasq-dns-675f4bcbfc-g4b22\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.269821 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zff\" (UniqueName: \"kubernetes.io/projected/10da10cd-5b38-421d-9965-0c599b6af564-kube-api-access-g8zff\") pod \"dnsmasq-dns-675f4bcbfc-g4b22\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.270116 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-config\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.270145 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fld5p\" (UniqueName: \"kubernetes.io/projected/93e9379c-00e0-400e-92f7-aa65fea2a922-kube-api-access-fld5p\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.270200 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.270230 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10da10cd-5b38-421d-9965-0c599b6af564-config\") pod \"dnsmasq-dns-675f4bcbfc-g4b22\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.271012 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10da10cd-5b38-421d-9965-0c599b6af564-config\") pod \"dnsmasq-dns-675f4bcbfc-g4b22\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 
08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.300130 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zff\" (UniqueName: \"kubernetes.io/projected/10da10cd-5b38-421d-9965-0c599b6af564-kube-api-access-g8zff\") pod \"dnsmasq-dns-675f4bcbfc-g4b22\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.341349 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.374724 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-config\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.374791 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fld5p\" (UniqueName: \"kubernetes.io/projected/93e9379c-00e0-400e-92f7-aa65fea2a922-kube-api-access-fld5p\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.374855 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.375619 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.376130 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-config\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.412250 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fld5p\" (UniqueName: \"kubernetes.io/projected/93e9379c-00e0-400e-92f7-aa65fea2a922-kube-api-access-fld5p\") pod \"dnsmasq-dns-78dd6ddcc-7czwz\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz"
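The dnsmasq-dns pods above walk through the kubelet's usual volume sequence: reconciler_common logs "operationExecutor.VerifyControllerAttachedVolume started", then "operationExecutor.MountVolume started", and operation_generator logs "MountVolume.SetUp succeeded" once the configmap or projected volume is materialized. A minimal sketch (a hypothetical script, reading journal text like the above on stdin) that flags volumes whose mount started but never logged success:

import re
import sys

started, succeeded = set(), set()
# Capture the short volume name (escaped or plain quotes) and the pod=... field.
key = re.compile(r'volume \\?"([^"\\]+)\\?".*pod="([^"]+)"')

for line in sys.stdin:
    m = key.search(line)
    if not m:
        continue
    if "operationExecutor.MountVolume started" in line:
        started.add(m.groups())
    elif "MountVolume.SetUp succeeded" in line:
        succeeded.add(m.groups())

for vol, pod in sorted(started - succeeded):
    print(f"pending mount: {vol} in {pod}")

Run as, for example, journalctl -u kubelet | python3 pending_mounts.py (script name illustrative); on the log above every started mount pairs with a SetUp success.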
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.814925 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7czwz"] Oct 06 08:58:20 crc kubenswrapper[4610]: I1006 08:58:20.967793 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g4b22"] Oct 06 08:58:20 crc kubenswrapper[4610]: W1006 08:58:20.972204 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10da10cd_5b38_421d_9965_0c599b6af564.slice/crio-7c25a0e798c6f772ce3775036679ee7cca2853bb7bf8f33b7c0670d2f36f4ce6 WatchSource:0}: Error finding container 7c25a0e798c6f772ce3775036679ee7cca2853bb7bf8f33b7c0670d2f36f4ce6: Status 404 returned error can't find the container with id 7c25a0e798c6f772ce3775036679ee7cca2853bb7bf8f33b7c0670d2f36f4ce6 Oct 06 08:58:21 crc kubenswrapper[4610]: I1006 08:58:21.296452 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" event={"ID":"10da10cd-5b38-421d-9965-0c599b6af564","Type":"ContainerStarted","Data":"7c25a0e798c6f772ce3775036679ee7cca2853bb7bf8f33b7c0670d2f36f4ce6"} Oct 06 08:58:21 crc kubenswrapper[4610]: I1006 08:58:21.297990 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" event={"ID":"93e9379c-00e0-400e-92f7-aa65fea2a922","Type":"ContainerStarted","Data":"d19ca016c9d96a57b6eac09277296cf07b842d34b0981b203473ade2d55877b3"} Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.158612 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g4b22"] Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.192374 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p9qs6"] Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.193778 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.220688 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p9qs6"] Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.320899 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.320993 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8cq\" (UniqueName: \"kubernetes.io/projected/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-kube-api-access-7h8cq\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.321037 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-config\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.425357 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.425448 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8cq\" (UniqueName: \"kubernetes.io/projected/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-kube-api-access-7h8cq\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.425481 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-config\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.426295 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.426734 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-config\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.472505 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8cq\" (UniqueName: 
\"kubernetes.io/projected/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-kube-api-access-7h8cq\") pod \"dnsmasq-dns-666b6646f7-p9qs6\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.514143 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7czwz"] Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.525030 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.553091 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gz94m"] Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.554775 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.575880 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gz94m"] Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.629986 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-config\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.630325 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.630480 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9wg\" (UniqueName: \"kubernetes.io/projected/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-kube-api-access-cs9wg\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.732192 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.732257 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9wg\" (UniqueName: \"kubernetes.io/projected/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-kube-api-access-cs9wg\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.732308 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-config\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.733126 4610 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-config\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.733610 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.763138 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9wg\" (UniqueName: \"kubernetes.io/projected/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-kube-api-access-cs9wg\") pod \"dnsmasq-dns-57d769cc4f-gz94m\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:23 crc kubenswrapper[4610]: I1006 08:58:23.950165 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.161807 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p9qs6"] Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.355408 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.356843 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" event={"ID":"f9ec08c0-5b56-43e4-b38f-f3097ce870b7","Type":"ContainerStarted","Data":"92ae941a68ec075f3bf01679fa9008be0eea45aca1511e753cd16122aab0064e"} Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.356967 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.360665 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.360801 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.360877 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-th5dt" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.360971 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.361251 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.366001 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.377276 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.386194 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.437264 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gz94m"] Oct 06 08:58:24 crc kubenswrapper[4610]: W1006 08:58:24.446350 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3b4804_8d1c_4a2d_84cc_d7a2d4dd7a85.slice/crio-80fdd003f1b6f37c066a86f02de40a4613bb4ebe5836a46e3e583f5aa674f0dc WatchSource:0}: Error finding container 80fdd003f1b6f37c066a86f02de40a4613bb4ebe5836a46e3e583f5aa674f0dc: Status 404 returned error can't find the container with id 80fdd003f1b6f37c066a86f02de40a4613bb4ebe5836a46e3e583f5aa674f0dc Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452201 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452346 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452428 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452504 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452573 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-config-data\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452640 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.452761 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blwpn\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-kube-api-access-blwpn\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.453850 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.453922 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.454067 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.454172 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.555835 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blwpn\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-kube-api-access-blwpn\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.555891 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 
08:58:24.555912 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.555938 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.555955 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.555982 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556013 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556031 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556077 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556097 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-config-data\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556117 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556438 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.556532 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.557267 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.557702 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-config-data\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.558268 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.558623 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.563498 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.563906 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.566772 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.576686 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blwpn\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-kube-api-access-blwpn\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.578460 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.582025 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.685336 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.727562 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.728907 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.737572 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.737725 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.737734 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.737856 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.738004 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.738109 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bwq8w" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.738566 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.746614 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861310 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861420 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861453 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861486 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861517 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861548 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861567 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb4s7\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-kube-api-access-gb4s7\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861595 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861630 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861655 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.861681 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963078 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963144 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963185 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963216 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb4s7\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-kube-api-access-gb4s7\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963245 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963275 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963311 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963343 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963374 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963444 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.963474 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.964357 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.965461 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.966028 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.966304 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.966523 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.967647 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.972747 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.972738 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.973177 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.974355 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.981914 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb4s7\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-kube-api-access-gb4s7\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:24 crc kubenswrapper[4610]: I1006 08:58:24.995957 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:25 crc kubenswrapper[4610]: I1006 08:58:25.103941 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:58:25 crc kubenswrapper[4610]: I1006 08:58:25.172562 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:58:25 crc kubenswrapper[4610]: W1006 08:58:25.176083 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2adc9dee_eebc_4fec_9af7_ecdcbf1136f3.slice/crio-a45e3ab8c4affcccd5400ecdc14efafd3cc9b133afd200e04274d5d987830163 WatchSource:0}: Error finding container a45e3ab8c4affcccd5400ecdc14efafd3cc9b133afd200e04274d5d987830163: Status 404 returned error can't find the container with id a45e3ab8c4affcccd5400ecdc14efafd3cc9b133afd200e04274d5d987830163 Oct 06 08:58:25 crc kubenswrapper[4610]: I1006 08:58:25.371188 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" event={"ID":"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85","Type":"ContainerStarted","Data":"80fdd003f1b6f37c066a86f02de40a4613bb4ebe5836a46e3e583f5aa674f0dc"} Oct 06 08:58:25 crc kubenswrapper[4610]: I1006 08:58:25.372892 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3","Type":"ContainerStarted","Data":"a45e3ab8c4affcccd5400ecdc14efafd3cc9b133afd200e04274d5d987830163"} Oct 06 08:58:25 crc kubenswrapper[4610]: I1006 08:58:25.375585 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:58:25 crc kubenswrapper[4610]: W1006 08:58:25.383620 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod764e6cbc_bf6c_4120_9e38_cf70e046dcf8.slice/crio-c30d30d4885cdcfa10f02d477d64007b9dd5ad09557d6801298e0618d8f91d9c WatchSource:0}: Error finding container c30d30d4885cdcfa10f02d477d64007b9dd5ad09557d6801298e0618d8f91d9c: Status 404 returned error can't find the container with id c30d30d4885cdcfa10f02d477d64007b9dd5ad09557d6801298e0618d8f91d9c Oct 06 08:58:26 crc kubenswrapper[4610]: I1006 
Oct 06 08:58:26 crc kubenswrapper[4610]: I1006 08:58:26.397596 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"764e6cbc-bf6c-4120-9e38-cf70e046dcf8","Type":"ContainerStarted","Data":"c30d30d4885cdcfa10f02d477d64007b9dd5ad09557d6801298e0618d8f91d9c"} Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.118498 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.120327 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.123114 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.123601 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hrpdt" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.123728 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.123875 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.125773 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.132813 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.153199 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.215962 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-kolla-config\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216244 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-secrets\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216442 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21951fd5-4bf8-4851-b82f-874f75967f7c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216554 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzhx\" (UniqueName: \"kubernetes.io/projected/21951fd5-4bf8-4851-b82f-874f75967f7c-kube-api-access-4xzhx\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216630 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216709 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216781 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216860 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.216944 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-config-data-default\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.317962 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-kolla-config\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318004 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-secrets\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318076 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21951fd5-4bf8-4851-b82f-874f75967f7c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318104 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzhx\" (UniqueName: \"kubernetes.io/projected/21951fd5-4bf8-4851-b82f-874f75967f7c-kube-api-access-4xzhx\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318144 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318163 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318182 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318232 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318255 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-config-data-default\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318524 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21951fd5-4bf8-4851-b82f-874f75967f7c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.318794 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.319018 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-config-data-default\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.319729 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.320653 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21951fd5-4bf8-4851-b82f-874f75967f7c-kolla-config\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.325622 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-secrets\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.325819 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.327196 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21951fd5-4bf8-4851-b82f-874f75967f7c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.336765 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.339109 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzhx\" (UniqueName: \"kubernetes.io/projected/21951fd5-4bf8-4851-b82f-874f75967f7c-kube-api-access-4xzhx\") pod \"openstack-galera-0\" (UID: \"21951fd5-4bf8-4851-b82f-874f75967f7c\") " pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.445843 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.494721 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.496102 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.502371 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hjczg" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.502589 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.502704 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.502946 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.602598 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628377 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628422 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628440 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628470 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628495 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628523 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628542 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628564 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98clt\" (UniqueName: \"kubernetes.io/projected/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-kube-api-access-98clt\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.628590 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729595 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729640 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729668 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98clt\" (UniqueName: \"kubernetes.io/projected/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-kube-api-access-98clt\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729698 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729759 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729779 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729792 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729818 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.729840 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.730296 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.730377 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.730610 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.731104 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.731443 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.748339 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.754342 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98clt\" (UniqueName: \"kubernetes.io/projected/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-kube-api-access-98clt\") pod \"openstack-cell1-galera-0\" (UID: 
\"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.754442 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.754910 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6faa6b22-87fb-46cf-93cf-0848f9f7ce06-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.760679 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6faa6b22-87fb-46cf-93cf-0848f9f7ce06\") " pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:27 crc kubenswrapper[4610]: I1006 08:58:27.819150 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.073873 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.074893 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.080255 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.080501 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-g8g8v" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.080715 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.087733 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.145487 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e8806-0063-480d-933b-5a6c760d503e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.145788 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef8e8806-0063-480d-933b-5a6c760d503e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.145944 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef8e8806-0063-480d-933b-5a6c760d503e-config-data\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.145994 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h68c\" (UniqueName: \"kubernetes.io/projected/ef8e8806-0063-480d-933b-5a6c760d503e-kube-api-access-6h68c\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.146155 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef8e8806-0063-480d-933b-5a6c760d503e-kolla-config\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.247745 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef8e8806-0063-480d-933b-5a6c760d503e-kolla-config\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.247782 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e8806-0063-480d-933b-5a6c760d503e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.247824 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef8e8806-0063-480d-933b-5a6c760d503e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.247867 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef8e8806-0063-480d-933b-5a6c760d503e-config-data\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.247887 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h68c\" (UniqueName: \"kubernetes.io/projected/ef8e8806-0063-480d-933b-5a6c760d503e-kube-api-access-6h68c\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.248675 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef8e8806-0063-480d-933b-5a6c760d503e-kolla-config\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.248760 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef8e8806-0063-480d-933b-5a6c760d503e-config-data\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.251368 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef8e8806-0063-480d-933b-5a6c760d503e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 
08:58:28.251934 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e8806-0063-480d-933b-5a6c760d503e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.267857 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h68c\" (UniqueName: \"kubernetes.io/projected/ef8e8806-0063-480d-933b-5a6c760d503e-kube-api-access-6h68c\") pod \"memcached-0\" (UID: \"ef8e8806-0063-480d-933b-5a6c760d503e\") " pod="openstack/memcached-0" Oct 06 08:58:28 crc kubenswrapper[4610]: I1006 08:58:28.399578 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 08:58:29 crc kubenswrapper[4610]: I1006 08:58:29.874528 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:58:29 crc kubenswrapper[4610]: I1006 08:58:29.875773 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:58:29 crc kubenswrapper[4610]: I1006 08:58:29.878087 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4hzrn" Oct 06 08:58:29 crc kubenswrapper[4610]: I1006 08:58:29.884932 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:58:30 crc kubenswrapper[4610]: I1006 08:58:30.004271 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22smf\" (UniqueName: \"kubernetes.io/projected/cf575405-4778-47c1-b0c1-b1a51c9936d1-kube-api-access-22smf\") pod \"kube-state-metrics-0\" (UID: \"cf575405-4778-47c1-b0c1-b1a51c9936d1\") " pod="openstack/kube-state-metrics-0" Oct 06 08:58:30 crc kubenswrapper[4610]: I1006 08:58:30.106033 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22smf\" (UniqueName: \"kubernetes.io/projected/cf575405-4778-47c1-b0c1-b1a51c9936d1-kube-api-access-22smf\") pod \"kube-state-metrics-0\" (UID: \"cf575405-4778-47c1-b0c1-b1a51c9936d1\") " pod="openstack/kube-state-metrics-0" Oct 06 08:58:30 crc kubenswrapper[4610]: I1006 08:58:30.148191 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22smf\" (UniqueName: \"kubernetes.io/projected/cf575405-4778-47c1-b0c1-b1a51c9936d1-kube-api-access-22smf\") pod \"kube-state-metrics-0\" (UID: \"cf575405-4778-47c1-b0c1-b1a51c9936d1\") " pod="openstack/kube-state-metrics-0" Oct 06 08:58:30 crc kubenswrapper[4610]: I1006 08:58:30.197032 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.754662 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.757932 4610 util.go:30] "No sandbox for pod can be found. 
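Need to start a new one" pod="openstack/ovsdbserver-nb-0"

Every volume in the memcached-0 sequence above moves through the same three log points: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). That is the kubelet volume manager reconciling its desired state of world against its actual state of world. A schematic sketch of that loop in plain Go, with toy types rather than kubelet's real ones:

```go
package main

import "fmt"

// Toy stand-ins for the kubelet volume manager's two "worlds".
type volume struct{ name, pod string }

type worlds struct {
	desired  []volume        // volumes the pod specs say must be mounted
	attached map[string]bool // volumes verified as attached to this node
	mounted  map[string]bool // volumes already set up in the pod directory
}

// reconcile drives each desired volume through
// verify-attached -> mount -> set-up, mirroring the log sequence above.
func (w *worlds) reconcile() {
	for _, v := range w.desired {
		if !w.attached[v.name] {
			fmt.Printf("VerifyControllerAttachedVolume started for %q pod %q\n", v.name, v.pod)
			w.attached[v.name] = true
			continue // verified this pass; mounting happens on a later pass
		}
		if !w.mounted[v.name] {
			fmt.Printf("MountVolume started for %q pod %q\n", v.name, v.pod)
			// ... the volume plugin's SetUp would run here ...
			w.mounted[v.name] = true
			fmt.Printf("MountVolume.SetUp succeeded for %q pod %q\n", v.name, v.pod)
		}
	}
}

func main() {
	w := &worlds{
		desired:  []volume{{"kolla-config", "memcached-0"}, {"combined-ca-bundle", "memcached-0"}},
		attached: map[string]bool{},
		mounted:  map[string]bool{},
	}
	w.reconcile() // first pass: verify attachment
	w.reconcile() // second pass: mount and set up
}
```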
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.760182 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.760472 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.760486 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.760482 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-h6vd9" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.760576 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.773391 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.798707 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6hjff"] Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.800020 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.811110 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.811395 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.811621 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-n7tgj" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.822082 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-pfhq5"] Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.825055 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.848728 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6hjff"] Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.870978 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pfhq5"] Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884180 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-run-ovn\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884248 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884277 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-log-ovn\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884308 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8s4l\" (UniqueName: \"kubernetes.io/projected/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-kube-api-access-s8s4l\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884332 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884379 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss8fl\" (UniqueName: \"kubernetes.io/projected/1e77ce11-f629-48ab-820e-e67fbfc3ba57-kube-api-access-ss8fl\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884419 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e77ce11-f629-48ab-820e-e67fbfc3ba57-combined-ca-bundle\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884444 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-run\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884479 
4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884501 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884531 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e77ce11-f629-48ab-820e-e67fbfc3ba57-scripts\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884556 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884586 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884613 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e77ce11-f629-48ab-820e-e67fbfc3ba57-ovn-controller-tls-certs\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.884641 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986633 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986691 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-log-ovn\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986727 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8s4l\" (UniqueName: 
\"kubernetes.io/projected/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-kube-api-access-s8s4l\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986753 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986793 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-etc-ovs\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986828 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-lib\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986859 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss8fl\" (UniqueName: \"kubernetes.io/projected/1e77ce11-f629-48ab-820e-e67fbfc3ba57-kube-api-access-ss8fl\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986882 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-run\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986924 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e77ce11-f629-48ab-820e-e67fbfc3ba57-combined-ca-bundle\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986952 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-run\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.986991 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987013 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " 
pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987059 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e77ce11-f629-48ab-820e-e67fbfc3ba57-scripts\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987222 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478db756-12b3-40f7-b49c-49a548bdf337-scripts\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987239 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-log-ovn\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987251 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987306 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987339 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e77ce11-f629-48ab-820e-e67fbfc3ba57-ovn-controller-tls-certs\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987371 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfrh\" (UniqueName: \"kubernetes.io/projected/478db756-12b3-40f7-b49c-49a548bdf337-kube-api-access-hxfrh\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987395 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987415 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-log\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987469 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987528 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-run-ovn\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.987783 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-run-ovn\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.989838 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e77ce11-f629-48ab-820e-e67fbfc3ba57-scripts\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.989966 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e77ce11-f629-48ab-820e-e67fbfc3ba57-var-run\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.990248 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.991824 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.993235 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.993810 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:34 crc kubenswrapper[4610]: I1006 08:58:34.999555 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e77ce11-f629-48ab-820e-e67fbfc3ba57-combined-ca-bundle\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.000864 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.013553 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.014114 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss8fl\" (UniqueName: \"kubernetes.io/projected/1e77ce11-f629-48ab-820e-e67fbfc3ba57-kube-api-access-ss8fl\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.014801 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8s4l\" (UniqueName: \"kubernetes.io/projected/ea778a76-1f2e-4289-8b2f-7ccc1975eb3d-kube-api-access-s8s4l\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.015697 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e77ce11-f629-48ab-820e-e67fbfc3ba57-ovn-controller-tls-certs\") pod \"ovn-controller-6hjff\" (UID: \"1e77ce11-f629-48ab-820e-e67fbfc3ba57\") " pod="openstack/ovn-controller-6hjff" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.023321 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.089571 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478db756-12b3-40f7-b49c-49a548bdf337-scripts\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.089701 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfrh\" (UniqueName: \"kubernetes.io/projected/478db756-12b3-40f7-b49c-49a548bdf337-kube-api-access-hxfrh\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.089735 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-log\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.089877 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-etc-ovs\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " 
pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.089911 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-lib\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.090004 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-run\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.090005 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-log\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.090204 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-run\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.090294 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-etc-ovs\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.091481 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478db756-12b3-40f7-b49c-49a548bdf337-scripts\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.091682 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/478db756-12b3-40f7-b49c-49a548bdf337-var-lib\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.105376 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfrh\" (UniqueName: \"kubernetes.io/projected/478db756-12b3-40f7-b49c-49a548bdf337-kube-api-access-hxfrh\") pod \"ovn-controller-ovs-pfhq5\" (UID: \"478db756-12b3-40f7-b49c-49a548bdf337\") " pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.128356 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.141985 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6hjff" Oct 06 08:58:35 crc kubenswrapper[4610]: I1006 08:58:35.148820 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.883928 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.885505 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.888828 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.888890 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dr5b5" Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.889015 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.889106 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 08:58:36 crc kubenswrapper[4610]: I1006 08:58:36.903672 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.026747 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.026804 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f51717ef-7ac5-45b1-ae7c-beddba660645-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.026828 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f51717ef-7ac5-45b1-ae7c-beddba660645-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.026866 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2dn\" (UniqueName: \"kubernetes.io/projected/f51717ef-7ac5-45b1-ae7c-beddba660645-kube-api-access-mn2dn\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.026904 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51717ef-7ac5-45b1-ae7c-beddba660645-config\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.027137 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 
08:58:37.027221 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.027268 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128435 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128490 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f51717ef-7ac5-45b1-ae7c-beddba660645-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128522 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f51717ef-7ac5-45b1-ae7c-beddba660645-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128565 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2dn\" (UniqueName: \"kubernetes.io/projected/f51717ef-7ac5-45b1-ae7c-beddba660645-kube-api-access-mn2dn\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128607 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51717ef-7ac5-45b1-ae7c-beddba660645-config\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128653 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128680 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.128702 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.129102 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f51717ef-7ac5-45b1-ae7c-beddba660645-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.129113 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.129763 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f51717ef-7ac5-45b1-ae7c-beddba660645-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.131836 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51717ef-7ac5-45b1-ae7c-beddba660645-config\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.134254 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.135363 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.136581 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51717ef-7ac5-45b1-ae7c-beddba660645-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.144433 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2dn\" (UniqueName: \"kubernetes.io/projected/f51717ef-7ac5-45b1-ae7c-beddba660645-kube-api-access-mn2dn\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.147591 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f51717ef-7ac5-45b1-ae7c-beddba660645\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:37 crc kubenswrapper[4610]: I1006 08:58:37.220265 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:58:53 crc kubenswrapper[4610]: E1006 08:58:53.056635 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:58:53 crc kubenswrapper[4610]: E1006 08:58:53.057277 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fld5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7czwz_openstack(93e9379c-00e0-400e-92f7-aa65fea2a922): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:58:53 crc kubenswrapper[4610]: E1006 08:58:53.059482 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" podUID="93e9379c-00e0-400e-92f7-aa65fea2a922" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.231842 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.232342 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cs9wg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gz94m_openstack(0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.233768 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" podUID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.239562 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.239767 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 
/var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gb4s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(764e6cbc-bf6c-4120-9e38-cf70e046dcf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.240866 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.243908 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.244029 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7h8cq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-p9qs6_openstack(f9ec08c0-5b56-43e4-b38f-f3097ce870b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.245160 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" podUID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.293954 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.294471 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8zff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-g4b22_openstack(10da10cd-5b38-421d-9965-0c599b6af564): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.295895 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" podUID="10da10cd-5b38-421d-9965-0c599b6af564" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.333264 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.333431 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-blwpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(2adc9dee-eebc-4fec-9af7-ecdcbf1136f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.335119 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.449154 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.561315 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-config\") pod \"93e9379c-00e0-400e-92f7-aa65fea2a922\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.561441 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-dns-svc\") pod \"93e9379c-00e0-400e-92f7-aa65fea2a922\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.561514 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fld5p\" (UniqueName: \"kubernetes.io/projected/93e9379c-00e0-400e-92f7-aa65fea2a922-kube-api-access-fld5p\") pod \"93e9379c-00e0-400e-92f7-aa65fea2a922\" (UID: \"93e9379c-00e0-400e-92f7-aa65fea2a922\") " Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.562165 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-config" (OuterVolumeSpecName: "config") pod "93e9379c-00e0-400e-92f7-aa65fea2a922" (UID: "93e9379c-00e0-400e-92f7-aa65fea2a922"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.562152 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93e9379c-00e0-400e-92f7-aa65fea2a922" (UID: "93e9379c-00e0-400e-92f7-aa65fea2a922"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.568727 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e9379c-00e0-400e-92f7-aa65fea2a922-kube-api-access-fld5p" (OuterVolumeSpecName: "kube-api-access-fld5p") pod "93e9379c-00e0-400e-92f7-aa65fea2a922" (UID: "93e9379c-00e0-400e-92f7-aa65fea2a922"). InnerVolumeSpecName "kube-api-access-fld5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.633285 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.633342 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7czwz" event={"ID":"93e9379c-00e0-400e-92f7-aa65fea2a922","Type":"ContainerDied","Data":"d19ca016c9d96a57b6eac09277296cf07b842d34b0981b203473ade2d55877b3"} Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.634934 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" podUID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.636306 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.636338 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" podUID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" Oct 06 08:58:54 crc kubenswrapper[4610]: E1006 08:58:54.637019 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.663950 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fld5p\" (UniqueName: \"kubernetes.io/projected/93e9379c-00e0-400e-92f7-aa65fea2a922-kube-api-access-fld5p\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.664198 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.664211 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e9379c-00e0-400e-92f7-aa65fea2a922-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.763252 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.806505 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7czwz"] Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.811748 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7czwz"] Oct 06 08:58:54 crc kubenswrapper[4610]: I1006 08:58:54.820117 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 08:58:54 crc kubenswrapper[4610]: W1006 08:58:54.825322 4610 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef8e8806_0063_480d_933b_5a6c760d503e.slice/crio-6423e7c37ba499b36c3883f27952722b842af83ff2213367a647d28b265aa522 WatchSource:0}: Error finding container 6423e7c37ba499b36c3883f27952722b842af83ff2213367a647d28b265aa522: Status 404 returned error can't find the container with id 6423e7c37ba499b36c3883f27952722b842af83ff2213367a647d28b265aa522 Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.079760 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e9379c-00e0-400e-92f7-aa65fea2a922" path="/var/lib/kubelet/pods/93e9379c-00e0-400e-92f7-aa65fea2a922/volumes" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.083374 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.141289 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.153961 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6hjff"] Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.164412 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:58:55 crc kubenswrapper[4610]: W1006 08:58:55.165799 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6faa6b22_87fb_46cf_93cf_0848f9f7ce06.slice/crio-a526938401dde9b3515338288fd38d01787c6eb68604433793976f363553d9a8 WatchSource:0}: Error finding container a526938401dde9b3515338288fd38d01787c6eb68604433793976f363553d9a8: Status 404 returned error can't find the container with id a526938401dde9b3515338288fd38d01787c6eb68604433793976f363553d9a8 Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.175283 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10da10cd-5b38-421d-9965-0c599b6af564-config\") pod \"10da10cd-5b38-421d-9965-0c599b6af564\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.175470 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zff\" (UniqueName: \"kubernetes.io/projected/10da10cd-5b38-421d-9965-0c599b6af564-kube-api-access-g8zff\") pod \"10da10cd-5b38-421d-9965-0c599b6af564\" (UID: \"10da10cd-5b38-421d-9965-0c599b6af564\") " Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.175693 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10da10cd-5b38-421d-9965-0c599b6af564-config" (OuterVolumeSpecName: "config") pod "10da10cd-5b38-421d-9965-0c599b6af564" (UID: "10da10cd-5b38-421d-9965-0c599b6af564"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.175825 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10da10cd-5b38-421d-9965-0c599b6af564-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.180115 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10da10cd-5b38-421d-9965-0c599b6af564-kube-api-access-g8zff" (OuterVolumeSpecName: "kube-api-access-g8zff") pod "10da10cd-5b38-421d-9965-0c599b6af564" (UID: "10da10cd-5b38-421d-9965-0c599b6af564"). InnerVolumeSpecName "kube-api-access-g8zff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.277572 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zff\" (UniqueName: \"kubernetes.io/projected/10da10cd-5b38-421d-9965-0c599b6af564-kube-api-access-g8zff\") on node \"crc\" DevicePath \"\"" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.645082 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ef8e8806-0063-480d-933b-5a6c760d503e","Type":"ContainerStarted","Data":"6423e7c37ba499b36c3883f27952722b842af83ff2213367a647d28b265aa522"} Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.647596 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6faa6b22-87fb-46cf-93cf-0848f9f7ce06","Type":"ContainerStarted","Data":"a526938401dde9b3515338288fd38d01787c6eb68604433793976f363553d9a8"} Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.649186 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff" event={"ID":"1e77ce11-f629-48ab-820e-e67fbfc3ba57","Type":"ContainerStarted","Data":"ce49d36be5abe2f8b7550350b94a75bace490d5ff762e3b8af5280f37d4a2023"} Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.650337 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" event={"ID":"10da10cd-5b38-421d-9965-0c599b6af564","Type":"ContainerDied","Data":"7c25a0e798c6f772ce3775036679ee7cca2853bb7bf8f33b7c0670d2f36f4ce6"} Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.650486 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g4b22" Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.658462 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cf575405-4778-47c1-b0c1-b1a51c9936d1","Type":"ContainerStarted","Data":"873b27a304dab5994e9a265e53cb068755436e385657fc7519caa409daf8186c"} Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.660507 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"21951fd5-4bf8-4851-b82f-874f75967f7c","Type":"ContainerStarted","Data":"7842de284730ada01648b78451bea5065f62769a2178e3b3db22ea65e24ce13d"} Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.729996 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g4b22"] Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.737107 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g4b22"] Oct 06 08:58:55 crc kubenswrapper[4610]: I1006 08:58:55.864757 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pfhq5"] Oct 06 08:58:55 crc kubenswrapper[4610]: W1006 08:58:55.909528 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478db756_12b3_40f7_b49c_49a548bdf337.slice/crio-b81a19bfdd6193e3e882fadd1cdd397f4ca77fb16a1ff92662e5865daba569d2 WatchSource:0}: Error finding container b81a19bfdd6193e3e882fadd1cdd397f4ca77fb16a1ff92662e5865daba569d2: Status 404 returned error can't find the container with id b81a19bfdd6193e3e882fadd1cdd397f4ca77fb16a1ff92662e5865daba569d2 Oct 06 08:58:56 crc kubenswrapper[4610]: I1006 08:58:56.668261 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pfhq5" event={"ID":"478db756-12b3-40f7-b49c-49a548bdf337","Type":"ContainerStarted","Data":"b81a19bfdd6193e3e882fadd1cdd397f4ca77fb16a1ff92662e5865daba569d2"} Oct 06 08:58:56 crc kubenswrapper[4610]: I1006 08:58:56.715355 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:58:56 crc kubenswrapper[4610]: I1006 08:58:56.867867 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:58:57 crc kubenswrapper[4610]: I1006 08:58:57.080207 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10da10cd-5b38-421d-9965-0c599b6af564" path="/var/lib/kubelet/pods/10da10cd-5b38-421d-9965-0c599b6af564/volumes" Oct 06 08:58:57 crc kubenswrapper[4610]: I1006 08:58:57.678345 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f51717ef-7ac5-45b1-ae7c-beddba660645","Type":"ContainerStarted","Data":"54d3e524d7f8d01f88b299f2f43fe56d9274c835ef41a8b30cf64173877ec51b"} Oct 06 08:58:57 crc kubenswrapper[4610]: I1006 08:58:57.680505 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d","Type":"ContainerStarted","Data":"34155bfb92cb893f55062f41a121303c20b29088115362483dac4adf7c747cec"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.736945 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d","Type":"ContainerStarted","Data":"1133f1e0f1a96d0560b4d462f6880a320117cfdbc2441b0e97754770184338d6"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.738614 4610 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff" event={"ID":"1e77ce11-f629-48ab-820e-e67fbfc3ba57","Type":"ContainerStarted","Data":"5d81b881d507ed273bc6257daf49d5cc09cda69814dac0cb2747d311766b5960"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.740621 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6hjff" Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.743577 4610 generic.go:334] "Generic (PLEG): container finished" podID="478db756-12b3-40f7-b49c-49a548bdf337" containerID="1e1f9c4ce3dee29a7d4eb2155100f7c971598744cacf2eb93b53c8f5427052b0" exitCode=0 Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.743687 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pfhq5" event={"ID":"478db756-12b3-40f7-b49c-49a548bdf337","Type":"ContainerDied","Data":"1e1f9c4ce3dee29a7d4eb2155100f7c971598744cacf2eb93b53c8f5427052b0"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.750479 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f51717ef-7ac5-45b1-ae7c-beddba660645","Type":"ContainerStarted","Data":"c705584c78b6ca7d603a51914bdef1d24dded0b33f9790a4de97f8774acaf87a"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.752794 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cf575405-4778-47c1-b0c1-b1a51c9936d1","Type":"ContainerStarted","Data":"d4f73b9bbf53dbd41bf1dde3d799a21f990a36bb7fd5b5be1259a74dc8c1f380"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.753163 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.772864 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"21951fd5-4bf8-4851-b82f-874f75967f7c","Type":"ContainerStarted","Data":"0306acf7508dcaf6e73cca84d2ff490752ed504c9aa6319892d3bebc575db59e"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.775192 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ef8e8806-0063-480d-933b-5a6c760d503e","Type":"ContainerStarted","Data":"042ca676d580496461ef87a8684220a1b4a69e8e13e09e6810edc6b8b55c969d"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.776381 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.776679 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6hjff" podStartSLOduration=22.664600653 podStartE2EDuration="29.776663291s" podCreationTimestamp="2025-10-06 08:58:34 +0000 UTC" firstStartedPulling="2025-10-06 08:58:55.149654592 +0000 UTC m=+1066.864707970" lastFinishedPulling="2025-10-06 08:59:02.26171723 +0000 UTC m=+1073.976770608" observedRunningTime="2025-10-06 08:59:03.770733043 +0000 UTC m=+1075.485786431" watchObservedRunningTime="2025-10-06 08:59:03.776663291 +0000 UTC m=+1075.491716679" Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.780192 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6faa6b22-87fb-46cf-93cf-0848f9f7ce06","Type":"ContainerStarted","Data":"6d6e8cac3f6ef244f346f267435117072a4c724912442133b31a3e5f8fba2e87"} Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.787510 4610 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=27.235138526 podStartE2EDuration="34.787491389s" podCreationTimestamp="2025-10-06 08:58:29 +0000 UTC" firstStartedPulling="2025-10-06 08:58:55.143666173 +0000 UTC m=+1066.858719561" lastFinishedPulling="2025-10-06 08:59:02.696019026 +0000 UTC m=+1074.411072424" observedRunningTime="2025-10-06 08:59:03.78491054 +0000 UTC m=+1075.499963928" watchObservedRunningTime="2025-10-06 08:59:03.787491389 +0000 UTC m=+1075.502544777" Oct 06 08:59:03 crc kubenswrapper[4610]: I1006 08:59:03.880723 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=30.891596689 podStartE2EDuration="35.880705234s" podCreationTimestamp="2025-10-06 08:58:28 +0000 UTC" firstStartedPulling="2025-10-06 08:58:54.828311126 +0000 UTC m=+1066.543364514" lastFinishedPulling="2025-10-06 08:58:59.817419671 +0000 UTC m=+1071.532473059" observedRunningTime="2025-10-06 08:59:03.878747773 +0000 UTC m=+1075.593801181" watchObservedRunningTime="2025-10-06 08:59:03.880705234 +0000 UTC m=+1075.595758622" Oct 06 08:59:04 crc kubenswrapper[4610]: I1006 08:59:04.789982 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pfhq5" event={"ID":"478db756-12b3-40f7-b49c-49a548bdf337","Type":"ContainerStarted","Data":"03c66a657c3b2d25203fe3e4a54d7e947654e7d6bbfb6266f0ace374cab3f259"} Oct 06 08:59:04 crc kubenswrapper[4610]: I1006 08:59:04.790255 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pfhq5" event={"ID":"478db756-12b3-40f7-b49c-49a548bdf337","Type":"ContainerStarted","Data":"61c2a08b56bf5d1cc93371bdd6caa08982b0ea0959dab0f2c8b6945919bf8aa7"} Oct 06 08:59:04 crc kubenswrapper[4610]: I1006 08:59:04.791203 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:59:04 crc kubenswrapper[4610]: I1006 08:59:04.791219 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:59:04 crc kubenswrapper[4610]: I1006 08:59:04.817085 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-pfhq5" podStartSLOduration=24.562743104 podStartE2EDuration="30.817065527s" podCreationTimestamp="2025-10-06 08:58:34 +0000 UTC" firstStartedPulling="2025-10-06 08:58:55.913379229 +0000 UTC m=+1067.628432617" lastFinishedPulling="2025-10-06 08:59:02.167701652 +0000 UTC m=+1073.882755040" observedRunningTime="2025-10-06 08:59:04.816912533 +0000 UTC m=+1076.531965921" watchObservedRunningTime="2025-10-06 08:59:04.817065527 +0000 UTC m=+1076.532118925" Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.805090 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ea778a76-1f2e-4289-8b2f-7ccc1975eb3d","Type":"ContainerStarted","Data":"162b2eae724b33f277fa0676104e3fc76dbfaab5eae07a9f5ca1b717b3ff82c2"} Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.806795 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f51717ef-7ac5-45b1-ae7c-beddba660645","Type":"ContainerStarted","Data":"8f32f23cab8d615c627aa7f77b84baa1062c8d9a3fb4a6525cf47a3e46d144b9"} Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.808566 4610 generic.go:334] "Generic (PLEG): container finished" podID="21951fd5-4bf8-4851-b82f-874f75967f7c" 
containerID="0306acf7508dcaf6e73cca84d2ff490752ed504c9aa6319892d3bebc575db59e" exitCode=0 Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.808692 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"21951fd5-4bf8-4851-b82f-874f75967f7c","Type":"ContainerDied","Data":"0306acf7508dcaf6e73cca84d2ff490752ed504c9aa6319892d3bebc575db59e"} Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.811386 4610 generic.go:334] "Generic (PLEG): container finished" podID="6faa6b22-87fb-46cf-93cf-0848f9f7ce06" containerID="6d6e8cac3f6ef244f346f267435117072a4c724912442133b31a3e5f8fba2e87" exitCode=0 Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.811677 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6faa6b22-87fb-46cf-93cf-0848f9f7ce06","Type":"ContainerDied","Data":"6d6e8cac3f6ef244f346f267435117072a4c724912442133b31a3e5f8fba2e87"} Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.834725 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.02428035 podStartE2EDuration="33.834709153s" podCreationTimestamp="2025-10-06 08:58:33 +0000 UTC" firstStartedPulling="2025-10-06 08:58:57.224121585 +0000 UTC m=+1068.939174973" lastFinishedPulling="2025-10-06 08:59:06.034550388 +0000 UTC m=+1077.749603776" observedRunningTime="2025-10-06 08:59:06.832697779 +0000 UTC m=+1078.547751177" watchObservedRunningTime="2025-10-06 08:59:06.834709153 +0000 UTC m=+1078.549762541" Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.915159 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=23.090555969 podStartE2EDuration="31.915140968s" podCreationTimestamp="2025-10-06 08:58:35 +0000 UTC" firstStartedPulling="2025-10-06 08:58:57.222341478 +0000 UTC m=+1068.937394876" lastFinishedPulling="2025-10-06 08:59:06.046926487 +0000 UTC m=+1077.761979875" observedRunningTime="2025-10-06 08:59:06.913813303 +0000 UTC m=+1078.628866731" watchObservedRunningTime="2025-10-06 08:59:06.915140968 +0000 UTC m=+1078.630194356" Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.988969 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-62bzc"] Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.989886 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:06 crc kubenswrapper[4610]: I1006 08:59:06.997903 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.010564 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-62bzc"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.094039 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cfae3507-92ed-4d33-85d2-b5a0c3beed93-ovs-rundir\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.094152 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfae3507-92ed-4d33-85d2-b5a0c3beed93-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.094200 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cfae3507-92ed-4d33-85d2-b5a0c3beed93-ovn-rundir\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.094244 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfae3507-92ed-4d33-85d2-b5a0c3beed93-config\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.094271 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8v5\" (UniqueName: \"kubernetes.io/projected/cfae3507-92ed-4d33-85d2-b5a0c3beed93-kube-api-access-4h8v5\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.094292 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae3507-92ed-4d33-85d2-b5a0c3beed93-combined-ca-bundle\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.195527 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfae3507-92ed-4d33-85d2-b5a0c3beed93-config\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.195817 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8v5\" (UniqueName: \"kubernetes.io/projected/cfae3507-92ed-4d33-85d2-b5a0c3beed93-kube-api-access-4h8v5\") pod \"ovn-controller-metrics-62bzc\" (UID: 
\"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.195836 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae3507-92ed-4d33-85d2-b5a0c3beed93-combined-ca-bundle\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.195894 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cfae3507-92ed-4d33-85d2-b5a0c3beed93-ovs-rundir\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.195962 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfae3507-92ed-4d33-85d2-b5a0c3beed93-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.196022 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cfae3507-92ed-4d33-85d2-b5a0c3beed93-ovn-rundir\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.197162 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfae3507-92ed-4d33-85d2-b5a0c3beed93-config\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.201323 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfae3507-92ed-4d33-85d2-b5a0c3beed93-combined-ca-bundle\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.202172 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cfae3507-92ed-4d33-85d2-b5a0c3beed93-ovn-rundir\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.202649 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cfae3507-92ed-4d33-85d2-b5a0c3beed93-ovs-rundir\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.207637 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfae3507-92ed-4d33-85d2-b5a0c3beed93-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 
08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.222511 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.222600 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.226416 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8v5\" (UniqueName: \"kubernetes.io/projected/cfae3507-92ed-4d33-85d2-b5a0c3beed93-kube-api-access-4h8v5\") pod \"ovn-controller-metrics-62bzc\" (UID: \"cfae3507-92ed-4d33-85d2-b5a0c3beed93\") " pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.275501 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.341776 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-62bzc" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.387500 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gz94m"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.440729 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fljkv"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.445425 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.449351 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fljkv"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.449622 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.617165 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwh9p\" (UniqueName: \"kubernetes.io/projected/ea422d2b-a77f-4103-948c-d86e75a81c91-kube-api-access-qwh9p\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.617511 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.617586 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.617662 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-config\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc 
kubenswrapper[4610]: I1006 08:59:07.687749 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p9qs6"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.721030 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-config\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.721107 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwh9p\" (UniqueName: \"kubernetes.io/projected/ea422d2b-a77f-4103-948c-d86e75a81c91-kube-api-access-qwh9p\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.721145 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.721220 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.722293 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-config\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.722433 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.723102 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.771533 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwh9p\" (UniqueName: \"kubernetes.io/projected/ea422d2b-a77f-4103-948c-d86e75a81c91-kube-api-access-qwh9p\") pod \"dnsmasq-dns-6bc7876d45-fljkv\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.776647 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-fms24"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.779171 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.782472 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.796768 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-62bzc"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.811907 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-fms24"] Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.841430 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3","Type":"ContainerStarted","Data":"435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f"} Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.844398 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-62bzc" event={"ID":"cfae3507-92ed-4d33-85d2-b5a0c3beed93","Type":"ContainerStarted","Data":"a56389d1d6ce672bd854dcfe4ceeddf8acd380d7ee32447dc83396e285e98981"} Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.846612 4610 generic.go:334] "Generic (PLEG): container finished" podID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" containerID="9d4686b282ba3e8937f76561209be94a605a39add74e0818652b54f1ea2939df" exitCode=0 Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.846957 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" event={"ID":"f9ec08c0-5b56-43e4-b38f-f3097ce870b7","Type":"ContainerDied","Data":"9d4686b282ba3e8937f76561209be94a605a39add74e0818652b54f1ea2939df"} Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.849159 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6faa6b22-87fb-46cf-93cf-0848f9f7ce06","Type":"ContainerStarted","Data":"42896cc29464d7c40f21ac029c7b605356c9b72a8cf113cf05d2bbbf3a961227"} Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.851681 4610 generic.go:334] "Generic (PLEG): container finished" podID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" containerID="5f94379f88e69b6a66099260a16353848873b072389a3cb576c74fa8ffab5ec5" exitCode=0 Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.851865 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" event={"ID":"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85","Type":"ContainerDied","Data":"5f94379f88e69b6a66099260a16353848873b072389a3cb576c74fa8ffab5ec5"} Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.858074 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"21951fd5-4bf8-4851-b82f-874f75967f7c","Type":"ContainerStarted","Data":"babd92e2cdbeec7432e54c0339bfa903e1590ce42dde26be8a29d31ac57bf135"} Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.916012 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.925728 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-dns-svc\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 
08:59:07.925824 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzjc\" (UniqueName: \"kubernetes.io/projected/e63da204-5b27-4c2f-9e93-668b8d279a19-kube-api-access-nfzjc\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.925894 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.925923 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-config\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.925983 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:07 crc kubenswrapper[4610]: I1006 08:59:07.971219 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=34.603074212 podStartE2EDuration="41.971037267s" podCreationTimestamp="2025-10-06 08:58:26 +0000 UTC" firstStartedPulling="2025-10-06 08:58:54.800488327 +0000 UTC m=+1066.515541715" lastFinishedPulling="2025-10-06 08:59:02.168451382 +0000 UTC m=+1073.883504770" observedRunningTime="2025-10-06 08:59:07.96740891 +0000 UTC m=+1079.682462328" watchObservedRunningTime="2025-10-06 08:59:07.971037267 +0000 UTC m=+1079.686090665" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.029774 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-dns-svc\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.029859 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzjc\" (UniqueName: \"kubernetes.io/projected/e63da204-5b27-4c2f-9e93-668b8d279a19-kube-api-access-nfzjc\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.029950 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.029982 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-config\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.030076 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.030993 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.031587 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.556699032 podStartE2EDuration="42.031575145s" podCreationTimestamp="2025-10-06 08:58:26 +0000 UTC" firstStartedPulling="2025-10-06 08:58:55.167217118 +0000 UTC m=+1066.882270506" lastFinishedPulling="2025-10-06 08:59:01.642093231 +0000 UTC m=+1073.357146619" observedRunningTime="2025-10-06 08:59:07.995157307 +0000 UTC m=+1079.710210695" watchObservedRunningTime="2025-10-06 08:59:08.031575145 +0000 UTC m=+1079.746628533" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.033158 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-config\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.033740 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-dns-svc\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.033969 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.061244 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzjc\" (UniqueName: \"kubernetes.io/projected/e63da204-5b27-4c2f-9e93-668b8d279a19-kube-api-access-nfzjc\") pod \"dnsmasq-dns-8554648995-fms24\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") " pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.068228 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.106065 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.136586 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.224103 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.301940 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.401740 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.442075 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h8cq\" (UniqueName: \"kubernetes.io/projected/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-kube-api-access-7h8cq\") pod \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.442157 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-dns-svc\") pod \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.442220 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-config\") pod \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\" (UID: \"f9ec08c0-5b56-43e4-b38f-f3097ce870b7\") " Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.449624 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-kube-api-access-7h8cq" (OuterVolumeSpecName: "kube-api-access-7h8cq") pod "f9ec08c0-5b56-43e4-b38f-f3097ce870b7" (UID: "f9ec08c0-5b56-43e4-b38f-f3097ce870b7"). InnerVolumeSpecName "kube-api-access-7h8cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.476865 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-config" (OuterVolumeSpecName: "config") pod "f9ec08c0-5b56-43e4-b38f-f3097ce870b7" (UID: "f9ec08c0-5b56-43e4-b38f-f3097ce870b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.488455 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9ec08c0-5b56-43e4-b38f-f3097ce870b7" (UID: "f9ec08c0-5b56-43e4-b38f-f3097ce870b7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.490422 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.547502 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-dns-svc\") pod \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.547600 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9wg\" (UniqueName: \"kubernetes.io/projected/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-kube-api-access-cs9wg\") pod \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.547654 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-config\") pod \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\" (UID: \"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85\") " Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.548023 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.548057 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.548071 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h8cq\" (UniqueName: \"kubernetes.io/projected/f9ec08c0-5b56-43e4-b38f-f3097ce870b7-kube-api-access-7h8cq\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.561726 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-kube-api-access-cs9wg" (OuterVolumeSpecName: "kube-api-access-cs9wg") pod "0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" (UID: "0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85"). InnerVolumeSpecName "kube-api-access-cs9wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.583871 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" (UID: "0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.602011 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-config" (OuterVolumeSpecName: "config") pod "0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" (UID: "0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.648841 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs9wg\" (UniqueName: \"kubernetes.io/projected/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-kube-api-access-cs9wg\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.648875 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.648886 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.718391 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fljkv"] Oct 06 08:59:08 crc kubenswrapper[4610]: W1006 08:59:08.722592 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea422d2b_a77f_4103_948c_d86e75a81c91.slice/crio-3622af8b1cc0e45fc63b7f9ccd6b030ca29b0e24786b5a0e2c89d70afd21a5ef WatchSource:0}: Error finding container 3622af8b1cc0e45fc63b7f9ccd6b030ca29b0e24786b5a0e2c89d70afd21a5ef: Status 404 returned error can't find the container with id 3622af8b1cc0e45fc63b7f9ccd6b030ca29b0e24786b5a0e2c89d70afd21a5ef Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.848910 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-fms24"] Oct 06 08:59:08 crc kubenswrapper[4610]: W1006 08:59:08.856010 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode63da204_5b27_4c2f_9e93_668b8d279a19.slice/crio-53e04553a2a9408635fd7a42d4f078fd92fea09d5315fa89d72f5747a6853ea2 WatchSource:0}: Error finding container 53e04553a2a9408635fd7a42d4f078fd92fea09d5315fa89d72f5747a6853ea2: Status 404 returned error can't find the container with id 53e04553a2a9408635fd7a42d4f078fd92fea09d5315fa89d72f5747a6853ea2 Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.866230 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" event={"ID":"ea422d2b-a77f-4103-948c-d86e75a81c91","Type":"ContainerStarted","Data":"3622af8b1cc0e45fc63b7f9ccd6b030ca29b0e24786b5a0e2c89d70afd21a5ef"} Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.870162 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" event={"ID":"0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85","Type":"ContainerDied","Data":"80fdd003f1b6f37c066a86f02de40a4613bb4ebe5836a46e3e583f5aa674f0dc"} Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.870238 4610 scope.go:117] "RemoveContainer" containerID="5f94379f88e69b6a66099260a16353848873b072389a3cb576c74fa8ffab5ec5" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.870367 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gz94m" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.882955 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-62bzc" event={"ID":"cfae3507-92ed-4d33-85d2-b5a0c3beed93","Type":"ContainerStarted","Data":"970cdc4eab7ef940659a0a2d3a879b16176253b9472b97679e7eee3288829153"} Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.886075 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-fms24" event={"ID":"e63da204-5b27-4c2f-9e93-668b8d279a19","Type":"ContainerStarted","Data":"53e04553a2a9408635fd7a42d4f078fd92fea09d5315fa89d72f5747a6853ea2"} Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.890296 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" event={"ID":"f9ec08c0-5b56-43e4-b38f-f3097ce870b7","Type":"ContainerDied","Data":"92ae941a68ec075f3bf01679fa9008be0eea45aca1511e753cd16122aab0064e"} Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.890619 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p9qs6" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.890787 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.904163 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-62bzc" podStartSLOduration=2.904146553 podStartE2EDuration="2.904146553s" podCreationTimestamp="2025-10-06 08:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:59:08.90402241 +0000 UTC m=+1080.619075818" watchObservedRunningTime="2025-10-06 08:59:08.904146553 +0000 UTC m=+1080.619199941" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.904783 4610 scope.go:117] "RemoveContainer" containerID="9d4686b282ba3e8937f76561209be94a605a39add74e0818652b54f1ea2939df" Oct 06 08:59:08 crc kubenswrapper[4610]: I1006 08:59:08.986852 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.022900 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gz94m"] Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.045420 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gz94m"] Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.077178 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p9qs6"] Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.103459 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" path="/var/lib/kubelet/pods/0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85/volumes" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.103949 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p9qs6"] Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.377504 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:59:09 crc kubenswrapper[4610]: E1006 08:59:09.377902 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" containerName="init" Oct 06 08:59:09 crc 
kubenswrapper[4610]: I1006 08:59:09.377925 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" containerName="init" Oct 06 08:59:09 crc kubenswrapper[4610]: E1006 08:59:09.377973 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" containerName="init" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.377981 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" containerName="init" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.378199 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" containerName="init" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.378218 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3b4804-8d1c-4a2d-84cc-d7a2d4dd7a85" containerName="init" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.379276 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.382695 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.382887 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.382998 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lkfrz" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.384554 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.389401 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.502256 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f726409a-ab18-426c-84e7-2d8ae473a3d4-config\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.502311 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.502714 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.502807 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwhm\" (UniqueName: \"kubernetes.io/projected/f726409a-ab18-426c-84e7-2d8ae473a3d4-kube-api-access-njwhm\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.503630 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.503941 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f726409a-ab18-426c-84e7-2d8ae473a3d4-scripts\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.504353 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f726409a-ab18-426c-84e7-2d8ae473a3d4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605472 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605554 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f726409a-ab18-426c-84e7-2d8ae473a3d4-scripts\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605604 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f726409a-ab18-426c-84e7-2d8ae473a3d4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605629 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f726409a-ab18-426c-84e7-2d8ae473a3d4-config\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605657 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605705 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.605725 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwhm\" (UniqueName: \"kubernetes.io/projected/f726409a-ab18-426c-84e7-2d8ae473a3d4-kube-api-access-njwhm\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc 
kubenswrapper[4610]: I1006 08:59:09.606585 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f726409a-ab18-426c-84e7-2d8ae473a3d4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.606905 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f726409a-ab18-426c-84e7-2d8ae473a3d4-config\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.606912 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f726409a-ab18-426c-84e7-2d8ae473a3d4-scripts\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.610784 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.610865 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.616964 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f726409a-ab18-426c-84e7-2d8ae473a3d4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.634868 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwhm\" (UniqueName: \"kubernetes.io/projected/f726409a-ab18-426c-84e7-2d8ae473a3d4-kube-api-access-njwhm\") pod \"ovn-northd-0\" (UID: \"f726409a-ab18-426c-84e7-2d8ae473a3d4\") " pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.704254 4610 util.go:30] "No sandbox for pod can be found. 
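On the add side, the records above for ovn-northd-0 run the mirror-image per-volume pipeline: VerifyControllerAttachedVolume, then MountVolume, then MountVolume.SetUp succeeded for each volume in the pod spec; the sandbox is only started once every volume is in place. A sketch of that ordering, with the volume list taken from the log and the setUp helper assumed:

package main

import "fmt"

// desired lists the volumes the ovn-northd-0 spec asks for, per the log.
var desired = []string{
	"config", "scripts", "ovn-rundir",
	"combined-ca-bundle", "metrics-certs-tls-certs",
	"ovn-northd-tls-certs", "kube-api-access-njwhm",
}

// setUp stands in for the plugin SetUp call (configmap, secret, projected, ...).
func setUp(name string) error { return nil }

func main() {
	for _, name := range desired {
		// 1. verify the volume is attached (trivial for node-local plugins)
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q\n", name)
		// 2. mount it into the pod's volumes dir
		fmt.Printf("MountVolume started for volume %q\n", name)
		if err := setUp(name); err != nil {
			fmt.Printf("MountVolume.SetUp failed for %q: %v\n", name, err)
			continue // retried with backoff; the pod stays pending
		}
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", name)
	}
}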
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.899124 4610 generic.go:334] "Generic (PLEG): container finished" podID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerID="1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358" exitCode=0 Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.899167 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" event={"ID":"ea422d2b-a77f-4103-948c-d86e75a81c91","Type":"ContainerDied","Data":"1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358"} Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.901188 4610 generic.go:334] "Generic (PLEG): container finished" podID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerID="8c3519d0da4c9ecb9bd7506c09388d8f3384dfad7ec1f20bd3b67f63a8802229" exitCode=0 Oct 06 08:59:09 crc kubenswrapper[4610]: I1006 08:59:09.901325 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-fms24" event={"ID":"e63da204-5b27-4c2f-9e93-668b8d279a19","Type":"ContainerDied","Data":"8c3519d0da4c9ecb9bd7506c09388d8f3384dfad7ec1f20bd3b67f63a8802229"} Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.174118 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.225288 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.483242 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fljkv"] Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.523628 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-glndl"] Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.525142 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.546655 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-glndl"] Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.632470 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrl8l\" (UniqueName: \"kubernetes.io/projected/9a9f3925-16de-4002-919d-413e1d94a7c0-kube-api-access-xrl8l\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.632521 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.632546 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.632596 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-config\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.632617 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.734707 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrl8l\" (UniqueName: \"kubernetes.io/projected/9a9f3925-16de-4002-919d-413e1d94a7c0-kube-api-access-xrl8l\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.734766 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.734795 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.734854 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-config\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.734884 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.735689 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.736675 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.736868 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.736991 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-config\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.760811 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrl8l\" (UniqueName: \"kubernetes.io/projected/9a9f3925-16de-4002-919d-413e1d94a7c0-kube-api-access-xrl8l\") pod \"dnsmasq-dns-b8fbc5445-glndl\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.840198 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 08:59:10 crc kubenswrapper[4610]: I1006 08:59:10.931618 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f726409a-ab18-426c-84e7-2d8ae473a3d4","Type":"ContainerStarted","Data":"452185ac20e065c0b0629cf7354bedf477b30d4ae2f520cf5077882b77ccb7cc"} Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.090161 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ec08c0-5b56-43e4-b38f-f3097ce870b7" path="/var/lib/kubelet/pods/f9ec08c0-5b56-43e4-b38f-f3097ce870b7/volumes" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.349753 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-glndl"] Oct 06 08:59:11 crc kubenswrapper[4610]: W1006 08:59:11.365276 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a9f3925_16de_4002_919d_413e1d94a7c0.slice/crio-d12d9adde4107d3c8ef001a80ea49b6f84a57fe52d365b1bfbb93a96f07994b1 WatchSource:0}: Error finding container d12d9adde4107d3c8ef001a80ea49b6f84a57fe52d365b1bfbb93a96f07994b1: Status 404 returned error can't find the container with id d12d9adde4107d3c8ef001a80ea49b6f84a57fe52d365b1bfbb93a96f07994b1 Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.559783 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.588927 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.594674 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.594688 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.594998 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-kg8fw" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.596138 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.603262 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.650569 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm7fv\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-kube-api-access-lm7fv\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.650653 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/05c553c8-ced7-4296-b8c5-12b91a953b1d-lock\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.650680 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " 
pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.650926 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/05c553c8-ced7-4296-b8c5-12b91a953b1d-cache\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.651001 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.751956 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.752161 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm7fv\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-kube-api-access-lm7fv\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.752212 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/05c553c8-ced7-4296-b8c5-12b91a953b1d-lock\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.752235 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.752286 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/05c553c8-ced7-4296-b8c5-12b91a953b1d-cache\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: E1006 08:59:11.752228 4610 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:59:11 crc kubenswrapper[4610]: E1006 08:59:11.752775 4610 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:59:11 crc kubenswrapper[4610]: E1006 08:59:11.752873 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift podName:05c553c8-ced7-4296-b8c5-12b91a953b1d nodeName:}" failed. No retries permitted until 2025-10-06 08:59:12.252815712 +0000 UTC m=+1083.967869150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift") pod "swift-storage-0" (UID: "05c553c8-ced7-4296-b8c5-12b91a953b1d") : configmap "swift-ring-files" not found Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.752871 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.753057 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/05c553c8-ced7-4296-b8c5-12b91a953b1d-cache\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.753063 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/05c553c8-ced7-4296-b8c5-12b91a953b1d-lock\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.781531 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.784017 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm7fv\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-kube-api-access-lm7fv\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.939343 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" event={"ID":"ea422d2b-a77f-4103-948c-d86e75a81c91","Type":"ContainerStarted","Data":"5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621"} Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.940320 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.939428 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="dnsmasq-dns" containerID="cri-o://5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621" gracePeriod=10 Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.941715 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-fms24" event={"ID":"e63da204-5b27-4c2f-9e93-668b8d279a19","Type":"ContainerStarted","Data":"ca1464a63ee7fd2bfc9f4b7d75c80759c2915e3d343feb9956c8df1a8ecb0488"} Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.941863 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-fms24" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.946889 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"764e6cbc-bf6c-4120-9e38-cf70e046dcf8","Type":"ContainerStarted","Data":"dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec"} Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.948717 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" event={"ID":"9a9f3925-16de-4002-919d-413e1d94a7c0","Type":"ContainerStarted","Data":"d12d9adde4107d3c8ef001a80ea49b6f84a57fe52d365b1bfbb93a96f07994b1"} Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.960894 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" podStartSLOduration=4.960872079 podStartE2EDuration="4.960872079s" podCreationTimestamp="2025-10-06 08:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:59:11.955717232 +0000 UTC m=+1083.670770640" watchObservedRunningTime="2025-10-06 08:59:11.960872079 +0000 UTC m=+1083.675925477" Oct 06 08:59:11 crc kubenswrapper[4610]: I1006 08:59:11.981426 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-fms24" podStartSLOduration=4.981405664 podStartE2EDuration="4.981405664s" podCreationTimestamp="2025-10-06 08:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:59:11.975857027 +0000 UTC m=+1083.690910405" watchObservedRunningTime="2025-10-06 08:59:11.981405664 +0000 UTC m=+1083.696459052" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.091742 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pv9bk"] Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.093228 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.096025 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.096259 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.096280 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.113659 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pv9bk"] Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.159969 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wk5q\" (UniqueName: \"kubernetes.io/projected/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-kube-api-access-9wk5q\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.160085 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-dispersionconf\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.160122 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-ring-data-devices\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.160199 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-combined-ca-bundle\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.160236 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-scripts\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.160306 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-etc-swift\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.160367 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 
08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261466 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wk5q\" (UniqueName: \"kubernetes.io/projected/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-kube-api-access-9wk5q\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261528 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-dispersionconf\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261571 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-ring-data-devices\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261623 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-combined-ca-bundle\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261641 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-scripts\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261675 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261696 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-etc-swift\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.261725 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: E1006 08:59:12.261929 4610 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:59:12 crc kubenswrapper[4610]: E1006 08:59:12.261942 4610 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:59:12 crc kubenswrapper[4610]: E1006 08:59:12.261976 4610 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift podName:05c553c8-ced7-4296-b8c5-12b91a953b1d nodeName:}" failed. No retries permitted until 2025-10-06 08:59:13.261963397 +0000 UTC m=+1084.977016785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift") pod "swift-storage-0" (UID: "05c553c8-ced7-4296-b8c5-12b91a953b1d") : configmap "swift-ring-files" not found Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.262694 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-etc-swift\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.263025 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-ring-data-devices\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.266029 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-combined-ca-bundle\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.266428 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.267576 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-scripts\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.275823 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-dispersionconf\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.292785 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wk5q\" (UniqueName: \"kubernetes.io/projected/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-kube-api-access-9wk5q\") pod \"swift-ring-rebalance-pv9bk\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.546800 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.824587 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.871992 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-config\") pod \"ea422d2b-a77f-4103-948c-d86e75a81c91\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.872396 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-ovsdbserver-sb\") pod \"ea422d2b-a77f-4103-948c-d86e75a81c91\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.872442 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwh9p\" (UniqueName: \"kubernetes.io/projected/ea422d2b-a77f-4103-948c-d86e75a81c91-kube-api-access-qwh9p\") pod \"ea422d2b-a77f-4103-948c-d86e75a81c91\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.872476 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-dns-svc\") pod \"ea422d2b-a77f-4103-948c-d86e75a81c91\" (UID: \"ea422d2b-a77f-4103-948c-d86e75a81c91\") " Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.889730 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea422d2b-a77f-4103-948c-d86e75a81c91-kube-api-access-qwh9p" (OuterVolumeSpecName: "kube-api-access-qwh9p") pod "ea422d2b-a77f-4103-948c-d86e75a81c91" (UID: "ea422d2b-a77f-4103-948c-d86e75a81c91"). InnerVolumeSpecName "kube-api-access-qwh9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.943003 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea422d2b-a77f-4103-948c-d86e75a81c91" (UID: "ea422d2b-a77f-4103-948c-d86e75a81c91"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.975639 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwh9p\" (UniqueName: \"kubernetes.io/projected/ea422d2b-a77f-4103-948c-d86e75a81c91-kube-api-access-qwh9p\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.975861 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.980485 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f726409a-ab18-426c-84e7-2d8ae473a3d4","Type":"ContainerStarted","Data":"4f19757755c363b3129337376f5fb1651888a3d61be7c7a631433e0ee536f8b8"} Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.988686 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-config" (OuterVolumeSpecName: "config") pod "ea422d2b-a77f-4103-948c-d86e75a81c91" (UID: "ea422d2b-a77f-4103-948c-d86e75a81c91"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.989369 4610 generic.go:334] "Generic (PLEG): container finished" podID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerID="8600434c25fa68074174302fe7241a279af41c963bcecbf68d85006435562bd5" exitCode=0 Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.989442 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" event={"ID":"9a9f3925-16de-4002-919d-413e1d94a7c0","Type":"ContainerDied","Data":"8600434c25fa68074174302fe7241a279af41c963bcecbf68d85006435562bd5"} Oct 06 08:59:12 crc kubenswrapper[4610]: I1006 08:59:12.991497 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea422d2b-a77f-4103-948c-d86e75a81c91" (UID: "ea422d2b-a77f-4103-948c-d86e75a81c91"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.003392 4610 generic.go:334] "Generic (PLEG): container finished" podID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerID="5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621" exitCode=0 Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.003448 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.003472 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" event={"ID":"ea422d2b-a77f-4103-948c-d86e75a81c91","Type":"ContainerDied","Data":"5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621"} Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.007933 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-fljkv" event={"ID":"ea422d2b-a77f-4103-948c-d86e75a81c91","Type":"ContainerDied","Data":"3622af8b1cc0e45fc63b7f9ccd6b030ca29b0e24786b5a0e2c89d70afd21a5ef"} Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.008030 4610 scope.go:117] "RemoveContainer" containerID="5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.067339 4610 scope.go:117] "RemoveContainer" containerID="1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.078221 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.078259 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea422d2b-a77f-4103-948c-d86e75a81c91-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.106146 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fljkv"] Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.106178 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-fljkv"] Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.155312 4610 scope.go:117] "RemoveContainer" containerID="5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621" Oct 06 08:59:13 crc 
kubenswrapper[4610]: E1006 08:59:13.156909 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621\": container with ID starting with 5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621 not found: ID does not exist" containerID="5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.156971 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621"} err="failed to get container status \"5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621\": rpc error: code = NotFound desc = could not find container \"5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621\": container with ID starting with 5e044d35d16e3eeecd0f61e44e5ffafa5ce4d33f9a6ce41bc985effad9eee621 not found: ID does not exist" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.157005 4610 scope.go:117] "RemoveContainer" containerID="1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358" Oct 06 08:59:13 crc kubenswrapper[4610]: E1006 08:59:13.158184 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358\": container with ID starting with 1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358 not found: ID does not exist" containerID="1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.158247 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358"} err="failed to get container status \"1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358\": rpc error: code = NotFound desc = could not find container \"1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358\": container with ID starting with 1ae3ac5fd9122d8792214ff1dbe885930eec844caaecf85cdec683c413464358 not found: ID does not exist" Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.159543 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pv9bk"] Oct 06 08:59:13 crc kubenswrapper[4610]: W1006 08:59:13.163005 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83179f37_2a3e_4b31_8d5e_fcdaf56961a5.slice/crio-ce8ed502ea627d2467189e8e96650aa5d2b478be4ad0ec6c1018239bc5123dbe WatchSource:0}: Error finding container ce8ed502ea627d2467189e8e96650aa5d2b478be4ad0ec6c1018239bc5123dbe: Status 404 returned error can't find the container with id ce8ed502ea627d2467189e8e96650aa5d2b478be4ad0ec6c1018239bc5123dbe Oct 06 08:59:13 crc kubenswrapper[4610]: I1006 08:59:13.280250 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:13 crc kubenswrapper[4610]: E1006 08:59:13.280453 4610 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:59:13 crc kubenswrapper[4610]: E1006 
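The NotFound errors above are the benign tail of container cleanup: kubelet re-queries the status of containers it has already removed, CRI-O answers NotFound over gRPC, and pod_container_deletor logs the error but treats the container as already gone, which keeps deletion idempotent. A Go sketch of that pattern (removeContainer is a stand-in for the CRI RemoveContainer call, not the real client):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for the CRI RemoveContainer RPC; here it
// always reports NotFound, as the runtime did for 5e044d35... above.
func removeContainer(id string) error {
	return status.Error(codes.NotFound, "could not find container "+id)
}

// deleteContainer treats NotFound as success: the desired state
// ("container absent") already holds, so cleanup is idempotent.
func deleteContainer(id string) error {
	err := removeContainer(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil
	}
	return fmt.Errorf("remove %s: %w", id, err)
}

func main() {
	if err := deleteContainer("5e044d35d16e"); err != nil {
		fmt.Println("cleanup failed:", err)
		return
	}
	fmt.Println("container gone (or was never there)")
}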
Oct 06 08:59:13 crc kubenswrapper[4610]: E1006 08:59:13.280543 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift podName:05c553c8-ced7-4296-b8c5-12b91a953b1d nodeName:}" failed. No retries permitted until 2025-10-06 08:59:15.280515943 +0000 UTC m=+1086.995569331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift") pod "swift-storage-0" (UID: "05c553c8-ced7-4296-b8c5-12b91a953b1d") : configmap "swift-ring-files" not found
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.019844 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pv9bk" event={"ID":"83179f37-2a3e-4b31-8d5e-fcdaf56961a5","Type":"ContainerStarted","Data":"ce8ed502ea627d2467189e8e96650aa5d2b478be4ad0ec6c1018239bc5123dbe"}
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.027507 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f726409a-ab18-426c-84e7-2d8ae473a3d4","Type":"ContainerStarted","Data":"d6a026adb0ec5c9e440c4868e14f579c2c425154fccb856268c70492a17b2895"}
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.027727 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.035121 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" event={"ID":"9a9f3925-16de-4002-919d-413e1d94a7c0","Type":"ContainerStarted","Data":"05cd332531da528c7c4b53c3febbebd690c03cf920eb818592abea5271832907"}
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.035295 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-glndl"
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.059412 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.668358868 podStartE2EDuration="5.059385681s" podCreationTimestamp="2025-10-06 08:59:09 +0000 UTC" firstStartedPulling="2025-10-06 08:59:10.229395466 +0000 UTC m=+1081.944448854" lastFinishedPulling="2025-10-06 08:59:12.620422269 +0000 UTC m=+1084.335475667" observedRunningTime="2025-10-06 08:59:14.051512642 +0000 UTC m=+1085.766566030" watchObservedRunningTime="2025-10-06 08:59:14.059385681 +0000 UTC m=+1085.774439089"
Oct 06 08:59:14 crc kubenswrapper[4610]: I1006 08:59:14.079197 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" podStartSLOduration=4.079172227 podStartE2EDuration="4.079172227s" podCreationTimestamp="2025-10-06 08:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:59:14.075659824 +0000 UTC m=+1085.790713212" watchObservedRunningTime="2025-10-06 08:59:14.079172227 +0000 UTC m=+1085.794225615"
Oct 06 08:59:15 crc kubenswrapper[4610]: I1006 08:59:15.082095 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" path="/var/lib/kubelet/pods/ea422d2b-a77f-4103-948c-d86e75a81c91/volumes"
Oct 06 08:59:15 crc kubenswrapper[4610]: I1006 08:59:15.314947 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0"
Oct 06 08:59:15 crc kubenswrapper[4610]: E1006 08:59:15.315153 4610 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 06 08:59:15 crc kubenswrapper[4610]: E1006 08:59:15.315194 4610 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 06 08:59:15 crc kubenswrapper[4610]: E1006 08:59:15.315250 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift podName:05c553c8-ced7-4296-b8c5-12b91a953b1d nodeName:}" failed. No retries permitted until 2025-10-06 08:59:19.315232331 +0000 UTC m=+1091.030285719 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift") pod "swift-storage-0" (UID: "05c553c8-ced7-4296-b8c5-12b91a953b1d") : configmap "swift-ring-files" not found
Oct 06 08:59:17 crc kubenswrapper[4610]: I1006 08:59:17.446859 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 06 08:59:17 crc kubenswrapper[4610]: I1006 08:59:17.447153 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 06 08:59:17 crc kubenswrapper[4610]: I1006 08:59:17.487551 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 06 08:59:17 crc kubenswrapper[4610]: I1006 08:59:17.819854 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:59:17 crc kubenswrapper[4610]: I1006 08:59:17.820351 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:59:17 crc kubenswrapper[4610]: I1006 08:59:17.879556 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.108232 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-fms24"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.115640 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.122692 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.441297 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-mnlxm"]
Oct 06 08:59:18 crc kubenswrapper[4610]: E1006 08:59:18.441746 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="dnsmasq-dns"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.441768 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="dnsmasq-dns"
Oct 06 08:59:18 crc kubenswrapper[4610]: E1006 08:59:18.441791 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="init"
podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="init" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.441801 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="init" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.442032 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea422d2b-a77f-4103-948c-d86e75a81c91" containerName="dnsmasq-dns" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.442728 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mnlxm" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.457064 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mnlxm"] Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.578112 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6zx\" (UniqueName: \"kubernetes.io/projected/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e-kube-api-access-ql6zx\") pod \"placement-db-create-mnlxm\" (UID: \"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e\") " pod="openstack/placement-db-create-mnlxm" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.588492 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v9lkn"] Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.589693 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9lkn" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.597151 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v9lkn"] Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.679505 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql6zx\" (UniqueName: \"kubernetes.io/projected/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e-kube-api-access-ql6zx\") pod \"placement-db-create-mnlxm\" (UID: \"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e\") " pod="openstack/placement-db-create-mnlxm" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.705924 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql6zx\" (UniqueName: \"kubernetes.io/projected/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e-kube-api-access-ql6zx\") pod \"placement-db-create-mnlxm\" (UID: \"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e\") " pod="openstack/placement-db-create-mnlxm" Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.760297 4610 util.go:30] "No sandbox for pod can be found. 
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.781618 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfw46\" (UniqueName: \"kubernetes.io/projected/fed19e15-b1d8-4a4b-84a5-7cf57afdb998-kube-api-access-nfw46\") pod \"glance-db-create-v9lkn\" (UID: \"fed19e15-b1d8-4a4b-84a5-7cf57afdb998\") " pod="openstack/glance-db-create-v9lkn"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.882899 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfw46\" (UniqueName: \"kubernetes.io/projected/fed19e15-b1d8-4a4b-84a5-7cf57afdb998-kube-api-access-nfw46\") pod \"glance-db-create-v9lkn\" (UID: \"fed19e15-b1d8-4a4b-84a5-7cf57afdb998\") " pod="openstack/glance-db-create-v9lkn"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.901646 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfw46\" (UniqueName: \"kubernetes.io/projected/fed19e15-b1d8-4a4b-84a5-7cf57afdb998-kube-api-access-nfw46\") pod \"glance-db-create-v9lkn\" (UID: \"fed19e15-b1d8-4a4b-84a5-7cf57afdb998\") " pod="openstack/glance-db-create-v9lkn"
Oct 06 08:59:18 crc kubenswrapper[4610]: I1006 08:59:18.905225 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9lkn"
Oct 06 08:59:19 crc kubenswrapper[4610]: I1006 08:59:19.393226 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0"
Oct 06 08:59:19 crc kubenswrapper[4610]: E1006 08:59:19.393457 4610 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 06 08:59:19 crc kubenswrapper[4610]: E1006 08:59:19.393475 4610 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 06 08:59:19 crc kubenswrapper[4610]: E1006 08:59:19.393527 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift podName:05c553c8-ced7-4296-b8c5-12b91a953b1d nodeName:}" failed. No retries permitted until 2025-10-06 08:59:27.393509401 +0000 UTC m=+1099.108562799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift") pod "swift-storage-0" (UID: "05c553c8-ced7-4296-b8c5-12b91a953b1d") : configmap "swift-ring-files" not found
Oct 06 08:59:20 crc kubenswrapper[4610]: I1006 08:59:20.843165 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-glndl"
Oct 06 08:59:20 crc kubenswrapper[4610]: I1006 08:59:20.948398 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-fms24"]
Oct 06 08:59:20 crc kubenswrapper[4610]: I1006 08:59:20.948929 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-fms24" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerName="dnsmasq-dns" containerID="cri-o://ca1464a63ee7fd2bfc9f4b7d75c80759c2915e3d343feb9956c8df1a8ecb0488" gracePeriod=10
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.124692 4610 generic.go:334] "Generic (PLEG): container finished" podID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerID="ca1464a63ee7fd2bfc9f4b7d75c80759c2915e3d343feb9956c8df1a8ecb0488" exitCode=0
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.124739 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-fms24" event={"ID":"e63da204-5b27-4c2f-9e93-668b8d279a19","Type":"ContainerDied","Data":"ca1464a63ee7fd2bfc9f4b7d75c80759c2915e3d343feb9956c8df1a8ecb0488"}
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.550342 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-fms24"
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.642747 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-config\") pod \"e63da204-5b27-4c2f-9e93-668b8d279a19\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") "
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.642802 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-sb\") pod \"e63da204-5b27-4c2f-9e93-668b8d279a19\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") "
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.642853 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfzjc\" (UniqueName: \"kubernetes.io/projected/e63da204-5b27-4c2f-9e93-668b8d279a19-kube-api-access-nfzjc\") pod \"e63da204-5b27-4c2f-9e93-668b8d279a19\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") "
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.642927 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-dns-svc\") pod \"e63da204-5b27-4c2f-9e93-668b8d279a19\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") "
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.643039 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-nb\") pod \"e63da204-5b27-4c2f-9e93-668b8d279a19\" (UID: \"e63da204-5b27-4c2f-9e93-668b8d279a19\") "
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.664731 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63da204-5b27-4c2f-9e93-668b8d279a19-kube-api-access-nfzjc" (OuterVolumeSpecName: "kube-api-access-nfzjc") pod "e63da204-5b27-4c2f-9e93-668b8d279a19" (UID: "e63da204-5b27-4c2f-9e93-668b8d279a19"). InnerVolumeSpecName "kube-api-access-nfzjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.682580 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e63da204-5b27-4c2f-9e93-668b8d279a19" (UID: "e63da204-5b27-4c2f-9e93-668b8d279a19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.709750 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e63da204-5b27-4c2f-9e93-668b8d279a19" (UID: "e63da204-5b27-4c2f-9e93-668b8d279a19"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.710411 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-config" (OuterVolumeSpecName: "config") pod "e63da204-5b27-4c2f-9e93-668b8d279a19" (UID: "e63da204-5b27-4c2f-9e93-668b8d279a19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.733536 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mnlxm"]
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.744522 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.744936 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-config\") on node \"crc\" DevicePath \"\""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.744961 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfzjc\" (UniqueName: \"kubernetes.io/projected/e63da204-5b27-4c2f-9e93-668b8d279a19-kube-api-access-nfzjc\") on node \"crc\" DevicePath \"\""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.744971 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.745696 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e63da204-5b27-4c2f-9e93-668b8d279a19" (UID: "e63da204-5b27-4c2f-9e93-668b8d279a19"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.748014 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v9lkn"] Oct 06 08:59:21 crc kubenswrapper[4610]: I1006 08:59:21.873220 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63da204-5b27-4c2f-9e93-668b8d279a19-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.139320 4610 generic.go:334] "Generic (PLEG): container finished" podID="6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e" containerID="d0d0d99963c6860f1eb1c1dd0cdab4bfe9ee960295dd0ab2858629f9c5983a09" exitCode=0 Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.139498 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mnlxm" event={"ID":"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e","Type":"ContainerDied","Data":"d0d0d99963c6860f1eb1c1dd0cdab4bfe9ee960295dd0ab2858629f9c5983a09"} Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.139652 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mnlxm" event={"ID":"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e","Type":"ContainerStarted","Data":"3993c3b9171ec567fd3b2941dc2691f9b8b28c7e8e1d8a6996bba8fb5f2e77c3"} Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.141253 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-fms24" event={"ID":"e63da204-5b27-4c2f-9e93-668b8d279a19","Type":"ContainerDied","Data":"53e04553a2a9408635fd7a42d4f078fd92fea09d5315fa89d72f5747a6853ea2"} Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.141291 4610 scope.go:117] "RemoveContainer" containerID="ca1464a63ee7fd2bfc9f4b7d75c80759c2915e3d343feb9956c8df1a8ecb0488" Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.141394 4610 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.149106 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pv9bk" event={"ID":"83179f37-2a3e-4b31-8d5e-fcdaf56961a5","Type":"ContainerStarted","Data":"4e455cd2e38edf8b8cd82c40386abfa0e9e1e244821c7fbae847527a4b52aff6"}
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.176131 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9lkn" event={"ID":"fed19e15-b1d8-4a4b-84a5-7cf57afdb998","Type":"ContainerStarted","Data":"6c873aab7d8e1dcbdc3e4aaf32ae629ceac137d8ab1f74a5604fe2a7d1eeb53b"}
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.176180 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9lkn" event={"ID":"fed19e15-b1d8-4a4b-84a5-7cf57afdb998","Type":"ContainerStarted","Data":"84b42677b58a83dc120cdd1d8ca47debe368bde525f146249cc90ddd459aed1b"}
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.177599 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pv9bk" podStartSLOduration=2.19260631 podStartE2EDuration="10.177588634s" podCreationTimestamp="2025-10-06 08:59:12 +0000 UTC" firstStartedPulling="2025-10-06 08:59:13.166262368 +0000 UTC m=+1084.881315756" lastFinishedPulling="2025-10-06 08:59:21.151244692 +0000 UTC m=+1092.866298080" observedRunningTime="2025-10-06 08:59:22.170069894 +0000 UTC m=+1093.885123282" watchObservedRunningTime="2025-10-06 08:59:22.177588634 +0000 UTC m=+1093.892642022"
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.185099 4610 scope.go:117] "RemoveContainer" containerID="8c3519d0da4c9ecb9bd7506c09388d8f3384dfad7ec1f20bd3b67f63a8802229"
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.211099 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-fms24"]
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.254429 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-fms24"]
Oct 06 08:59:22 crc kubenswrapper[4610]: I1006 08:59:22.258191 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-v9lkn" podStartSLOduration=4.258177104 podStartE2EDuration="4.258177104s" podCreationTimestamp="2025-10-06 08:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:59:22.226090652 +0000 UTC m=+1093.941144040" watchObservedRunningTime="2025-10-06 08:59:22.258177104 +0000 UTC m=+1093.973230492"
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.079759 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" path="/var/lib/kubelet/pods/e63da204-5b27-4c2f-9e93-668b8d279a19/volumes"
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.188249 4610 generic.go:334] "Generic (PLEG): container finished" podID="fed19e15-b1d8-4a4b-84a5-7cf57afdb998" containerID="6c873aab7d8e1dcbdc3e4aaf32ae629ceac137d8ab1f74a5604fe2a7d1eeb53b" exitCode=0
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.188305 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9lkn" event={"ID":"fed19e15-b1d8-4a4b-84a5-7cf57afdb998","Type":"ContainerDied","Data":"6c873aab7d8e1dcbdc3e4aaf32ae629ceac137d8ab1f74a5604fe2a7d1eeb53b"}
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.550571 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mnlxm"
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.616836 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql6zx\" (UniqueName: \"kubernetes.io/projected/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e-kube-api-access-ql6zx\") pod \"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e\" (UID: \"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e\") "
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.633217 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e-kube-api-access-ql6zx" (OuterVolumeSpecName: "kube-api-access-ql6zx") pod "6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e" (UID: "6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e"). InnerVolumeSpecName "kube-api-access-ql6zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:59:23 crc kubenswrapper[4610]: I1006 08:59:23.718402 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql6zx\" (UniqueName: \"kubernetes.io/projected/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e-kube-api-access-ql6zx\") on node \"crc\" DevicePath \"\""
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.198880 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mnlxm" event={"ID":"6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e","Type":"ContainerDied","Data":"3993c3b9171ec567fd3b2941dc2691f9b8b28c7e8e1d8a6996bba8fb5f2e77c3"}
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.198924 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3993c3b9171ec567fd3b2941dc2691f9b8b28c7e8e1d8a6996bba8fb5f2e77c3"
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.199097 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mnlxm"
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.647784 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9lkn"
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.765902 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.834933 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfw46\" (UniqueName: \"kubernetes.io/projected/fed19e15-b1d8-4a4b-84a5-7cf57afdb998-kube-api-access-nfw46\") pod \"fed19e15-b1d8-4a4b-84a5-7cf57afdb998\" (UID: \"fed19e15-b1d8-4a4b-84a5-7cf57afdb998\") "
Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.842435 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed19e15-b1d8-4a4b-84a5-7cf57afdb998-kube-api-access-nfw46" (OuterVolumeSpecName: "kube-api-access-nfw46") pod "fed19e15-b1d8-4a4b-84a5-7cf57afdb998" (UID: "fed19e15-b1d8-4a4b-84a5-7cf57afdb998"). InnerVolumeSpecName "kube-api-access-nfw46". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:24 crc kubenswrapper[4610]: I1006 08:59:24.937222 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfw46\" (UniqueName: \"kubernetes.io/projected/fed19e15-b1d8-4a4b-84a5-7cf57afdb998-kube-api-access-nfw46\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:25 crc kubenswrapper[4610]: I1006 08:59:25.229978 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v9lkn" event={"ID":"fed19e15-b1d8-4a4b-84a5-7cf57afdb998","Type":"ContainerDied","Data":"84b42677b58a83dc120cdd1d8ca47debe368bde525f146249cc90ddd459aed1b"} Oct 06 08:59:25 crc kubenswrapper[4610]: I1006 08:59:25.230015 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84b42677b58a83dc120cdd1d8ca47debe368bde525f146249cc90ddd459aed1b" Oct 06 08:59:25 crc kubenswrapper[4610]: I1006 08:59:25.230108 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v9lkn" Oct 06 08:59:27 crc kubenswrapper[4610]: I1006 08:59:27.480825 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:27 crc kubenswrapper[4610]: E1006 08:59:27.480988 4610 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:59:27 crc kubenswrapper[4610]: E1006 08:59:27.482294 4610 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:59:27 crc kubenswrapper[4610]: E1006 08:59:27.482347 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift podName:05c553c8-ced7-4296-b8c5-12b91a953b1d nodeName:}" failed. No retries permitted until 2025-10-06 08:59:43.482331564 +0000 UTC m=+1115.197384952 (durationBeforeRetry 16s). 
Oct 06 08:59:27 crc kubenswrapper[4610]: I1006 08:59:27.999119 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jhdzb"]
Oct 06 08:59:27 crc kubenswrapper[4610]: E1006 08:59:27.999900 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerName="dnsmasq-dns"
Oct 06 08:59:27 crc kubenswrapper[4610]: I1006 08:59:27.999972 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerName="dnsmasq-dns"
Oct 06 08:59:28 crc kubenswrapper[4610]: E1006 08:59:28.000032 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerName="init"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.000107 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerName="init"
Oct 06 08:59:28 crc kubenswrapper[4610]: E1006 08:59:28.000181 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed19e15-b1d8-4a4b-84a5-7cf57afdb998" containerName="mariadb-database-create"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.000233 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed19e15-b1d8-4a4b-84a5-7cf57afdb998" containerName="mariadb-database-create"
Oct 06 08:59:28 crc kubenswrapper[4610]: E1006 08:59:28.000296 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e" containerName="mariadb-database-create"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.000351 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e" containerName="mariadb-database-create"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.000546 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e" containerName="mariadb-database-create"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.000609 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63da204-5b27-4c2f-9e93-668b8d279a19" containerName="dnsmasq-dns"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.000658 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed19e15-b1d8-4a4b-84a5-7cf57afdb998" containerName="mariadb-database-create"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.001242 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jhdzb"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.006780 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jhdzb"]
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.089912 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmqd\" (UniqueName: \"kubernetes.io/projected/0e201fa9-8437-4672-9561-5eec037869f4-kube-api-access-jmmqd\") pod \"keystone-db-create-jhdzb\" (UID: \"0e201fa9-8437-4672-9561-5eec037869f4\") " pod="openstack/keystone-db-create-jhdzb"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.190830 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmqd\" (UniqueName: \"kubernetes.io/projected/0e201fa9-8437-4672-9561-5eec037869f4-kube-api-access-jmmqd\") pod \"keystone-db-create-jhdzb\" (UID: \"0e201fa9-8437-4672-9561-5eec037869f4\") " pod="openstack/keystone-db-create-jhdzb"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.217943 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmqd\" (UniqueName: \"kubernetes.io/projected/0e201fa9-8437-4672-9561-5eec037869f4-kube-api-access-jmmqd\") pod \"keystone-db-create-jhdzb\" (UID: \"0e201fa9-8437-4672-9561-5eec037869f4\") " pod="openstack/keystone-db-create-jhdzb"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.343372 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jhdzb"
Oct 06 08:59:28 crc kubenswrapper[4610]: I1006 08:59:28.898115 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jhdzb"]
Oct 06 08:59:28 crc kubenswrapper[4610]: W1006 08:59:28.908010 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e201fa9_8437_4672_9561_5eec037869f4.slice/crio-531fecea6caef75fe0ea5ce9d3c06569725bdbb9b2a9486369413392fa0cfb76 WatchSource:0}: Error finding container 531fecea6caef75fe0ea5ce9d3c06569725bdbb9b2a9486369413392fa0cfb76: Status 404 returned error can't find the container with id 531fecea6caef75fe0ea5ce9d3c06569725bdbb9b2a9486369413392fa0cfb76
Oct 06 08:59:29 crc kubenswrapper[4610]: I1006 08:59:29.259399 4610 generic.go:334] "Generic (PLEG): container finished" podID="0e201fa9-8437-4672-9561-5eec037869f4" containerID="3198771bf3833d1503552fcf70333e19f97d0ba4391a828e86d22ad4df5a3d76" exitCode=0
Oct 06 08:59:29 crc kubenswrapper[4610]: I1006 08:59:29.259441 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jhdzb" event={"ID":"0e201fa9-8437-4672-9561-5eec037869f4","Type":"ContainerDied","Data":"3198771bf3833d1503552fcf70333e19f97d0ba4391a828e86d22ad4df5a3d76"}
Oct 06 08:59:29 crc kubenswrapper[4610]: I1006 08:59:29.259739 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jhdzb" event={"ID":"0e201fa9-8437-4672-9561-5eec037869f4","Type":"ContainerStarted","Data":"531fecea6caef75fe0ea5ce9d3c06569725bdbb9b2a9486369413392fa0cfb76"}
Oct 06 08:59:29 crc kubenswrapper[4610]: I1006 08:59:29.261470 4610 generic.go:334] "Generic (PLEG): container finished" podID="83179f37-2a3e-4b31-8d5e-fcdaf56961a5" containerID="4e455cd2e38edf8b8cd82c40386abfa0e9e1e244821c7fbae847527a4b52aff6" exitCode=0
Oct 06 08:59:29 crc kubenswrapper[4610]: I1006 08:59:29.261530 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pv9bk" event={"ID":"83179f37-2a3e-4b31-8d5e-fcdaf56961a5","Type":"ContainerDied","Data":"4e455cd2e38edf8b8cd82c40386abfa0e9e1e244821c7fbae847527a4b52aff6"}
pod="openstack/swift-ring-rebalance-pv9bk" event={"ID":"83179f37-2a3e-4b31-8d5e-fcdaf56961a5","Type":"ContainerDied","Data":"4e455cd2e38edf8b8cd82c40386abfa0e9e1e244821c7fbae847527a4b52aff6"} Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.671086 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pv9bk" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.745761 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jhdzb" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.835909 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-dispersionconf\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.836012 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.836114 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-scripts\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.836195 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-combined-ca-bundle\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.836235 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-etc-swift\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.836276 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-ring-data-devices\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.836331 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wk5q\" (UniqueName: \"kubernetes.io/projected/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-kube-api-access-9wk5q\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.837094 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.837844 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.849429 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-kube-api-access-9wk5q" (OuterVolumeSpecName: "kube-api-access-9wk5q") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "kube-api-access-9wk5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.855609 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.863441 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:30 crc kubenswrapper[4610]: E1006 08:59:30.867379 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf podName:83179f37-2a3e-4b31-8d5e-fcdaf56961a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:59:31.367354051 +0000 UTC m=+1103.082407439 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5") : error deleting /var/lib/kubelet/pods/83179f37-2a3e-4b31-8d5e-fcdaf56961a5/volume-subpaths: remove /var/lib/kubelet/pods/83179f37-2a3e-4b31-8d5e-fcdaf56961a5/volume-subpaths: no such file or directory Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.867673 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-scripts" (OuterVolumeSpecName: "scripts") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.937955 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmmqd\" (UniqueName: \"kubernetes.io/projected/0e201fa9-8437-4672-9561-5eec037869f4-kube-api-access-jmmqd\") pod \"0e201fa9-8437-4672-9561-5eec037869f4\" (UID: \"0e201fa9-8437-4672-9561-5eec037869f4\") " Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.938689 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wk5q\" (UniqueName: \"kubernetes.io/projected/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-kube-api-access-9wk5q\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.938714 4610 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.938727 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.938741 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.938752 4610 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.938766 4610 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:30 crc kubenswrapper[4610]: I1006 08:59:30.940367 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e201fa9-8437-4672-9561-5eec037869f4-kube-api-access-jmmqd" (OuterVolumeSpecName: "kube-api-access-jmmqd") pod "0e201fa9-8437-4672-9561-5eec037869f4" (UID: "0e201fa9-8437-4672-9561-5eec037869f4"). InnerVolumeSpecName "kube-api-access-jmmqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.039700 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmmqd\" (UniqueName: \"kubernetes.io/projected/0e201fa9-8437-4672-9561-5eec037869f4-kube-api-access-jmmqd\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.289218 4610 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.289529 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pv9bk" event={"ID":"83179f37-2a3e-4b31-8d5e-fcdaf56961a5","Type":"ContainerDied","Data":"ce8ed502ea627d2467189e8e96650aa5d2b478be4ad0ec6c1018239bc5123dbe"}
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.289549 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8ed502ea627d2467189e8e96650aa5d2b478be4ad0ec6c1018239bc5123dbe"
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.290912 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jhdzb" event={"ID":"0e201fa9-8437-4672-9561-5eec037869f4","Type":"ContainerDied","Data":"531fecea6caef75fe0ea5ce9d3c06569725bdbb9b2a9486369413392fa0cfb76"}
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.290938 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531fecea6caef75fe0ea5ce9d3c06569725bdbb9b2a9486369413392fa0cfb76"
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.290983 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jhdzb"
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.444179 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf\") pod \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\" (UID: \"83179f37-2a3e-4b31-8d5e-fcdaf56961a5\") "
Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.447428 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "83179f37-2a3e-4b31-8d5e-fcdaf56961a5" (UID: "83179f37-2a3e-4b31-8d5e-fcdaf56961a5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:59:31 crc kubenswrapper[4610]: I1006 08:59:31.545714 4610 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/83179f37-2a3e-4b31-8d5e-fcdaf56961a5-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.184936 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6hjff" podUID="1e77ce11-f629-48ab-820e-e67fbfc3ba57" containerName="ovn-controller" probeResult="failure" output=< Oct 06 08:59:35 crc kubenswrapper[4610]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 08:59:35 crc kubenswrapper[4610]: > Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.192512 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.207037 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pfhq5" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.419447 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6hjff-config-l6z7q"] Oct 06 08:59:35 crc kubenswrapper[4610]: E1006 08:59:35.420124 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e201fa9-8437-4672-9561-5eec037869f4" containerName="mariadb-database-create" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.420143 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e201fa9-8437-4672-9561-5eec037869f4" containerName="mariadb-database-create" Oct 06 08:59:35 crc kubenswrapper[4610]: E1006 08:59:35.420182 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83179f37-2a3e-4b31-8d5e-fcdaf56961a5" containerName="swift-ring-rebalance" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.420191 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="83179f37-2a3e-4b31-8d5e-fcdaf56961a5" containerName="swift-ring-rebalance" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.420390 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e201fa9-8437-4672-9561-5eec037869f4" containerName="mariadb-database-create" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.420422 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="83179f37-2a3e-4b31-8d5e-fcdaf56961a5" containerName="swift-ring-rebalance" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.421037 4610 util.go:30] "No sandbox for pod can be found. 
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.437765 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.476969 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6hjff-config-l6z7q"]
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.516580 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run-ovn\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.516622 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-additional-scripts\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.516680 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-scripts\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.516719 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.516774 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-log-ovn\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.516797 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxbm\" (UniqueName: \"kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.617865 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-log-ovn\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618119 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxbm\" (UniqueName: \"kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q"
\"kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618213 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-log-ovn\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618322 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run-ovn\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618398 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-additional-scripts\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618496 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-scripts\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618578 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618440 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run-ovn\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.618723 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.619232 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-additional-scripts\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.620450 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-scripts\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.635768 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxbm\" (UniqueName: \"kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm\") pod \"ovn-controller-6hjff-config-l6z7q\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:35 crc kubenswrapper[4610]: I1006 08:59:35.747225 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:36 crc kubenswrapper[4610]: I1006 08:59:36.254994 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6hjff-config-l6z7q"] Oct 06 08:59:36 crc kubenswrapper[4610]: I1006 08:59:36.327924 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff-config-l6z7q" event={"ID":"8538eb53-c97f-46c2-aad9-e2e24396b313","Type":"ContainerStarted","Data":"af6f5964dba35c0061c0bdfad060bbf8e7e4ae5ce2ba99c523b9f2b22f4ab026"} Oct 06 08:59:37 crc kubenswrapper[4610]: I1006 08:59:37.336251 4610 generic.go:334] "Generic (PLEG): container finished" podID="8538eb53-c97f-46c2-aad9-e2e24396b313" containerID="37f7457232ba5abeb6c1eda06e9027474218e6abea335caccd389835d4d822b7" exitCode=0 Oct 06 08:59:37 crc kubenswrapper[4610]: I1006 08:59:37.336359 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff-config-l6z7q" event={"ID":"8538eb53-c97f-46c2-aad9-e2e24396b313","Type":"ContainerDied","Data":"37f7457232ba5abeb6c1eda06e9027474218e6abea335caccd389835d4d822b7"} Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.106032 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-09e4-account-create-dvf9l"] Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.107572 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.110865 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.114957 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-09e4-account-create-dvf9l"] Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.263188 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtsq\" (UniqueName: \"kubernetes.io/projected/4f6ed3df-0417-488a-a4e2-26ab08498a9f-kube-api-access-sdtsq\") pod \"keystone-09e4-account-create-dvf9l\" (UID: \"4f6ed3df-0417-488a-a4e2-26ab08498a9f\") " pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.364689 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtsq\" (UniqueName: \"kubernetes.io/projected/4f6ed3df-0417-488a-a4e2-26ab08498a9f-kube-api-access-sdtsq\") pod \"keystone-09e4-account-create-dvf9l\" (UID: \"4f6ed3df-0417-488a-a4e2-26ab08498a9f\") " pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.390682 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtsq\" (UniqueName: \"kubernetes.io/projected/4f6ed3df-0417-488a-a4e2-26ab08498a9f-kube-api-access-sdtsq\") pod \"keystone-09e4-account-create-dvf9l\" (UID: \"4f6ed3df-0417-488a-a4e2-26ab08498a9f\") " pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.432747 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-53ab-account-create-rwc72"] Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.433991 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.436685 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.449718 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.455409 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-53ab-account-create-rwc72"] Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.568588 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xql6p\" (UniqueName: \"kubernetes.io/projected/56ca654c-0a66-443d-97d2-4788f5738c56-kube-api-access-xql6p\") pod \"placement-53ab-account-create-rwc72\" (UID: \"56ca654c-0a66-443d-97d2-4788f5738c56\") " pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.670088 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xql6p\" (UniqueName: \"kubernetes.io/projected/56ca654c-0a66-443d-97d2-4788f5738c56-kube-api-access-xql6p\") pod \"placement-53ab-account-create-rwc72\" (UID: \"56ca654c-0a66-443d-97d2-4788f5738c56\") " pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.705591 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xql6p\" (UniqueName: \"kubernetes.io/projected/56ca654c-0a66-443d-97d2-4788f5738c56-kube-api-access-xql6p\") pod \"placement-53ab-account-create-rwc72\" (UID: \"56ca654c-0a66-443d-97d2-4788f5738c56\") " pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.755963 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1c22-account-create-s6pwd"] Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.757023 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.757735 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.759445 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.784928 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1c22-account-create-s6pwd"] Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.786227 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.873789 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-log-ovn\") pod \"8538eb53-c97f-46c2-aad9-e2e24396b313\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.874159 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-scripts\") pod \"8538eb53-c97f-46c2-aad9-e2e24396b313\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875252 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run-ovn\") pod \"8538eb53-c97f-46c2-aad9-e2e24396b313\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875317 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-additional-scripts\") pod \"8538eb53-c97f-46c2-aad9-e2e24396b313\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875371 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdxbm\" (UniqueName: \"kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm\") pod \"8538eb53-c97f-46c2-aad9-e2e24396b313\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875401 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run\") pod \"8538eb53-c97f-46c2-aad9-e2e24396b313\" (UID: \"8538eb53-c97f-46c2-aad9-e2e24396b313\") " Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.873949 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8538eb53-c97f-46c2-aad9-e2e24396b313" (UID: "8538eb53-c97f-46c2-aad9-e2e24396b313"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875141 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-scripts" (OuterVolumeSpecName: "scripts") pod "8538eb53-c97f-46c2-aad9-e2e24396b313" (UID: "8538eb53-c97f-46c2-aad9-e2e24396b313"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875656 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8538eb53-c97f-46c2-aad9-e2e24396b313" (UID: "8538eb53-c97f-46c2-aad9-e2e24396b313"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.875795 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run" (OuterVolumeSpecName: "var-run") pod "8538eb53-c97f-46c2-aad9-e2e24396b313" (UID: "8538eb53-c97f-46c2-aad9-e2e24396b313"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876432 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8538eb53-c97f-46c2-aad9-e2e24396b313" (UID: "8538eb53-c97f-46c2-aad9-e2e24396b313"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876648 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgzw6\" (UniqueName: \"kubernetes.io/projected/afb232da-0946-4113-9a7a-1aaea2706f8a-kube-api-access-lgzw6\") pod \"glance-1c22-account-create-s6pwd\" (UID: \"afb232da-0946-4113-9a7a-1aaea2706f8a\") " pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876907 4610 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876923 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876934 4610 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876945 4610 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8538eb53-c97f-46c2-aad9-e2e24396b313-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.876957 4610 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8538eb53-c97f-46c2-aad9-e2e24396b313-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.879788 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm" (OuterVolumeSpecName: "kube-api-access-bdxbm") pod "8538eb53-c97f-46c2-aad9-e2e24396b313" (UID: "8538eb53-c97f-46c2-aad9-e2e24396b313"). InnerVolumeSpecName "kube-api-access-bdxbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.950537 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-09e4-account-create-dvf9l"] Oct 06 08:59:38 crc kubenswrapper[4610]: W1006 08:59:38.954472 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6ed3df_0417_488a_a4e2_26ab08498a9f.slice/crio-916d1949abc653a4b668052ef823d684735c54d213f64cbed285c59d0250b054 WatchSource:0}: Error finding container 916d1949abc653a4b668052ef823d684735c54d213f64cbed285c59d0250b054: Status 404 returned error can't find the container with id 916d1949abc653a4b668052ef823d684735c54d213f64cbed285c59d0250b054 Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.978172 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgzw6\" (UniqueName: \"kubernetes.io/projected/afb232da-0946-4113-9a7a-1aaea2706f8a-kube-api-access-lgzw6\") pod \"glance-1c22-account-create-s6pwd\" (UID: \"afb232da-0946-4113-9a7a-1aaea2706f8a\") " pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.978251 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdxbm\" (UniqueName: \"kubernetes.io/projected/8538eb53-c97f-46c2-aad9-e2e24396b313-kube-api-access-bdxbm\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:38 crc kubenswrapper[4610]: I1006 08:59:38.994814 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgzw6\" (UniqueName: \"kubernetes.io/projected/afb232da-0946-4113-9a7a-1aaea2706f8a-kube-api-access-lgzw6\") pod \"glance-1c22-account-create-s6pwd\" (UID: \"afb232da-0946-4113-9a7a-1aaea2706f8a\") " pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.112909 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.231307 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-53ab-account-create-rwc72"] Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.351888 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-53ab-account-create-rwc72" event={"ID":"56ca654c-0a66-443d-97d2-4788f5738c56","Type":"ContainerStarted","Data":"b8646d093961843b8451f9530d0198e67a5a3f3012947ed7e303f0f52739828d"} Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.359082 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff-config-l6z7q" event={"ID":"8538eb53-c97f-46c2-aad9-e2e24396b313","Type":"ContainerDied","Data":"af6f5964dba35c0061c0bdfad060bbf8e7e4ae5ce2ba99c523b9f2b22f4ab026"} Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.359130 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af6f5964dba35c0061c0bdfad060bbf8e7e4ae5ce2ba99c523b9f2b22f4ab026" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.359146 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6hjff-config-l6z7q" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.361685 4610 generic.go:334] "Generic (PLEG): container finished" podID="4f6ed3df-0417-488a-a4e2-26ab08498a9f" containerID="1ff080250b301a7b5da06d570b76228da02dbfcfacca82762233fe1364cac112" exitCode=0 Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.361726 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-09e4-account-create-dvf9l" event={"ID":"4f6ed3df-0417-488a-a4e2-26ab08498a9f","Type":"ContainerDied","Data":"1ff080250b301a7b5da06d570b76228da02dbfcfacca82762233fe1364cac112"} Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.361783 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-09e4-account-create-dvf9l" event={"ID":"4f6ed3df-0417-488a-a4e2-26ab08498a9f","Type":"ContainerStarted","Data":"916d1949abc653a4b668052ef823d684735c54d213f64cbed285c59d0250b054"} Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.552860 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1c22-account-create-s6pwd"] Oct 06 08:59:39 crc kubenswrapper[4610]: W1006 08:59:39.564497 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb232da_0946_4113_9a7a_1aaea2706f8a.slice/crio-9d62cb158cb47b6ddf12df5fcc8198a591bea703863065fbe1c6b5dfa56827a6 WatchSource:0}: Error finding container 9d62cb158cb47b6ddf12df5fcc8198a591bea703863065fbe1c6b5dfa56827a6: Status 404 returned error can't find the container with id 9d62cb158cb47b6ddf12df5fcc8198a591bea703863065fbe1c6b5dfa56827a6 Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.895374 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6hjff-config-l6z7q"] Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.904107 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6hjff-config-l6z7q"] Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.997393 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6hjff-config-rdpkz"] Oct 06 08:59:39 crc kubenswrapper[4610]: E1006 08:59:39.998025 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8538eb53-c97f-46c2-aad9-e2e24396b313" containerName="ovn-config" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.998056 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8538eb53-c97f-46c2-aad9-e2e24396b313" containerName="ovn-config" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.998210 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8538eb53-c97f-46c2-aad9-e2e24396b313" containerName="ovn-config" Oct 06 08:59:39 crc kubenswrapper[4610]: I1006 08:59:39.998772 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.002121 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.020037 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6hjff-config-rdpkz"] Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.095404 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-log-ovn\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.095516 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-additional-scripts\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.095549 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run-ovn\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.095587 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.095711 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-scripts\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.095734 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8td5\" (UniqueName: \"kubernetes.io/projected/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-kube-api-access-t8td5\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.196730 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-log-ovn\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.196773 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-additional-scripts\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.196797 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run-ovn\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.196822 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.196912 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-scripts\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.196928 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8td5\" (UniqueName: \"kubernetes.io/projected/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-kube-api-access-t8td5\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.197077 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-log-ovn\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.197106 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.197709 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-additional-scripts\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.197761 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run-ovn\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.198993 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-scripts\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.205314 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6hjff" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.224017 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8td5\" (UniqueName: \"kubernetes.io/projected/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-kube-api-access-t8td5\") pod \"ovn-controller-6hjff-config-rdpkz\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.318189 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.381431 4610 generic.go:334] "Generic (PLEG): container finished" podID="56ca654c-0a66-443d-97d2-4788f5738c56" containerID="23d7f95edbc46b8ce7475b4ca3b2b2568bc9511ff9d1d2b80284a5f1515d2b96" exitCode=0 Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.381543 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-53ab-account-create-rwc72" event={"ID":"56ca654c-0a66-443d-97d2-4788f5738c56","Type":"ContainerDied","Data":"23d7f95edbc46b8ce7475b4ca3b2b2568bc9511ff9d1d2b80284a5f1515d2b96"} Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.384175 4610 generic.go:334] "Generic (PLEG): container finished" podID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerID="435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f" exitCode=0 Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.384229 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3","Type":"ContainerDied","Data":"435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f"} Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.386348 4610 generic.go:334] "Generic (PLEG): container finished" podID="afb232da-0946-4113-9a7a-1aaea2706f8a" containerID="4ffcc846c9a54e89ab72cc1bf3c89472d0835a2af9952e5fda2177a252f7ba3e" exitCode=0 Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.386551 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1c22-account-create-s6pwd" event={"ID":"afb232da-0946-4113-9a7a-1aaea2706f8a","Type":"ContainerDied","Data":"4ffcc846c9a54e89ab72cc1bf3c89472d0835a2af9952e5fda2177a252f7ba3e"} Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.386580 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1c22-account-create-s6pwd" event={"ID":"afb232da-0946-4113-9a7a-1aaea2706f8a","Type":"ContainerStarted","Data":"9d62cb158cb47b6ddf12df5fcc8198a591bea703863065fbe1c6b5dfa56827a6"} Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.884242 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.928564 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtsq\" (UniqueName: \"kubernetes.io/projected/4f6ed3df-0417-488a-a4e2-26ab08498a9f-kube-api-access-sdtsq\") pod \"4f6ed3df-0417-488a-a4e2-26ab08498a9f\" (UID: \"4f6ed3df-0417-488a-a4e2-26ab08498a9f\") " Oct 06 08:59:40 crc kubenswrapper[4610]: I1006 08:59:40.940305 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6ed3df-0417-488a-a4e2-26ab08498a9f-kube-api-access-sdtsq" (OuterVolumeSpecName: "kube-api-access-sdtsq") pod "4f6ed3df-0417-488a-a4e2-26ab08498a9f" (UID: "4f6ed3df-0417-488a-a4e2-26ab08498a9f"). InnerVolumeSpecName "kube-api-access-sdtsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.030041 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdtsq\" (UniqueName: \"kubernetes.io/projected/4f6ed3df-0417-488a-a4e2-26ab08498a9f-kube-api-access-sdtsq\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.082254 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8538eb53-c97f-46c2-aad9-e2e24396b313" path="/var/lib/kubelet/pods/8538eb53-c97f-46c2-aad9-e2e24396b313/volumes" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.114811 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6hjff-config-rdpkz"] Oct 06 08:59:41 crc kubenswrapper[4610]: W1006 08:59:41.129892 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d7110_d3e0_4ee7_918a_d6e4daef330b.slice/crio-be8e322f6832e8986de0395ee852d3d7d8536c10dfe5f63f01bc2142e4a0509d WatchSource:0}: Error finding container be8e322f6832e8986de0395ee852d3d7d8536c10dfe5f63f01bc2142e4a0509d: Status 404 returned error can't find the container with id be8e322f6832e8986de0395ee852d3d7d8536c10dfe5f63f01bc2142e4a0509d Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.396169 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff-config-rdpkz" event={"ID":"ac8d7110-d3e0-4ee7-918a-d6e4daef330b","Type":"ContainerStarted","Data":"be8e322f6832e8986de0395ee852d3d7d8536c10dfe5f63f01bc2142e4a0509d"} Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.398061 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3","Type":"ContainerStarted","Data":"f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4"} Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.398324 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.399163 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-09e4-account-create-dvf9l" event={"ID":"4f6ed3df-0417-488a-a4e2-26ab08498a9f","Type":"ContainerDied","Data":"916d1949abc653a4b668052ef823d684735c54d213f64cbed285c59d0250b054"} Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.399196 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916d1949abc653a4b668052ef823d684735c54d213f64cbed285c59d0250b054" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.399274 4610 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-09e4-account-create-dvf9l" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.913092 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.919152 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.930893 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.476678264 podStartE2EDuration="1m18.930871782s" podCreationTimestamp="2025-10-06 08:58:23 +0000 UTC" firstStartedPulling="2025-10-06 08:58:25.17967913 +0000 UTC m=+1036.894732518" lastFinishedPulling="2025-10-06 08:59:06.633872648 +0000 UTC m=+1078.348926036" observedRunningTime="2025-10-06 08:59:41.45195968 +0000 UTC m=+1113.167013088" watchObservedRunningTime="2025-10-06 08:59:41.930871782 +0000 UTC m=+1113.645925170" Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.954508 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgzw6\" (UniqueName: \"kubernetes.io/projected/afb232da-0946-4113-9a7a-1aaea2706f8a-kube-api-access-lgzw6\") pod \"afb232da-0946-4113-9a7a-1aaea2706f8a\" (UID: \"afb232da-0946-4113-9a7a-1aaea2706f8a\") " Oct 06 08:59:41 crc kubenswrapper[4610]: I1006 08:59:41.963387 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb232da-0946-4113-9a7a-1aaea2706f8a-kube-api-access-lgzw6" (OuterVolumeSpecName: "kube-api-access-lgzw6") pod "afb232da-0946-4113-9a7a-1aaea2706f8a" (UID: "afb232da-0946-4113-9a7a-1aaea2706f8a"). InnerVolumeSpecName "kube-api-access-lgzw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.056687 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xql6p\" (UniqueName: \"kubernetes.io/projected/56ca654c-0a66-443d-97d2-4788f5738c56-kube-api-access-xql6p\") pod \"56ca654c-0a66-443d-97d2-4788f5738c56\" (UID: \"56ca654c-0a66-443d-97d2-4788f5738c56\") " Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.057116 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgzw6\" (UniqueName: \"kubernetes.io/projected/afb232da-0946-4113-9a7a-1aaea2706f8a-kube-api-access-lgzw6\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.059763 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ca654c-0a66-443d-97d2-4788f5738c56-kube-api-access-xql6p" (OuterVolumeSpecName: "kube-api-access-xql6p") pod "56ca654c-0a66-443d-97d2-4788f5738c56" (UID: "56ca654c-0a66-443d-97d2-4788f5738c56"). InnerVolumeSpecName "kube-api-access-xql6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.158388 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xql6p\" (UniqueName: \"kubernetes.io/projected/56ca654c-0a66-443d-97d2-4788f5738c56-kube-api-access-xql6p\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.407491 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-53ab-account-create-rwc72" event={"ID":"56ca654c-0a66-443d-97d2-4788f5738c56","Type":"ContainerDied","Data":"b8646d093961843b8451f9530d0198e67a5a3f3012947ed7e303f0f52739828d"} Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.407754 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8646d093961843b8451f9530d0198e67a5a3f3012947ed7e303f0f52739828d" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.408746 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53ab-account-create-rwc72" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.408960 4610 generic.go:334] "Generic (PLEG): container finished" podID="ac8d7110-d3e0-4ee7-918a-d6e4daef330b" containerID="49e2b9439c917bab9422d724bf8134ba7c14f441da31953fdf53987e82958c4e" exitCode=0 Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.409029 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff-config-rdpkz" event={"ID":"ac8d7110-d3e0-4ee7-918a-d6e4daef330b","Type":"ContainerDied","Data":"49e2b9439c917bab9422d724bf8134ba7c14f441da31953fdf53987e82958c4e"} Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.410328 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1c22-account-create-s6pwd" Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.410370 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1c22-account-create-s6pwd" event={"ID":"afb232da-0946-4113-9a7a-1aaea2706f8a","Type":"ContainerDied","Data":"9d62cb158cb47b6ddf12df5fcc8198a591bea703863065fbe1c6b5dfa56827a6"} Oct 06 08:59:42 crc kubenswrapper[4610]: I1006 08:59:42.410422 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d62cb158cb47b6ddf12df5fcc8198a591bea703863065fbe1c6b5dfa56827a6" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.419778 4610 generic.go:334] "Generic (PLEG): container finished" podID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerID="dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec" exitCode=0 Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.420210 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"764e6cbc-bf6c-4120-9e38-cf70e046dcf8","Type":"ContainerDied","Data":"dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec"} Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.496268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" (UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.504799 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/05c553c8-ced7-4296-b8c5-12b91a953b1d-etc-swift\") pod \"swift-storage-0\" 
(UID: \"05c553c8-ced7-4296-b8c5-12b91a953b1d\") " pod="openstack/swift-storage-0" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.723437 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.767505 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6hjff-config-rdpkz" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801065 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run\") pod \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801350 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-scripts\") pod \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801451 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run-ovn\") pod \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801539 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8td5\" (UniqueName: \"kubernetes.io/projected/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-kube-api-access-t8td5\") pod \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801655 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-additional-scripts\") pod \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801782 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-log-ovn\") pod \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\" (UID: \"ac8d7110-d3e0-4ee7-918a-d6e4daef330b\") " Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.801281 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run" (OuterVolumeSpecName: "var-run") pod "ac8d7110-d3e0-4ee7-918a-d6e4daef330b" (UID: "ac8d7110-d3e0-4ee7-918a-d6e4daef330b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.802190 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ac8d7110-d3e0-4ee7-918a-d6e4daef330b" (UID: "ac8d7110-d3e0-4ee7-918a-d6e4daef330b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.803419 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-scripts" (OuterVolumeSpecName: "scripts") pod "ac8d7110-d3e0-4ee7-918a-d6e4daef330b" (UID: "ac8d7110-d3e0-4ee7-918a-d6e4daef330b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.803518 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ac8d7110-d3e0-4ee7-918a-d6e4daef330b" (UID: "ac8d7110-d3e0-4ee7-918a-d6e4daef330b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.804404 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ac8d7110-d3e0-4ee7-918a-d6e4daef330b" (UID: "ac8d7110-d3e0-4ee7-918a-d6e4daef330b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.807014 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-kube-api-access-t8td5" (OuterVolumeSpecName: "kube-api-access-t8td5") pod "ac8d7110-d3e0-4ee7-918a-d6e4daef330b" (UID: "ac8d7110-d3e0-4ee7-918a-d6e4daef330b"). InnerVolumeSpecName "kube-api-access-t8td5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.904296 4610 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.904488 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.904595 4610 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.904650 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8td5\" (UniqueName: \"kubernetes.io/projected/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-kube-api-access-t8td5\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.904703 4610 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.904752 4610 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac8d7110-d3e0-4ee7-918a-d6e4daef330b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.955856 4610 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-db-sync-tqdxt"] Oct 06 08:59:43 crc kubenswrapper[4610]: E1006 08:59:43.956367 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb232da-0946-4113-9a7a-1aaea2706f8a" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956402 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb232da-0946-4113-9a7a-1aaea2706f8a" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: E1006 08:59:43.956422 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ca654c-0a66-443d-97d2-4788f5738c56" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956432 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ca654c-0a66-443d-97d2-4788f5738c56" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: E1006 08:59:43.956457 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6ed3df-0417-488a-a4e2-26ab08498a9f" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956465 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6ed3df-0417-488a-a4e2-26ab08498a9f" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: E1006 08:59:43.956475 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8d7110-d3e0-4ee7-918a-d6e4daef330b" containerName="ovn-config" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956482 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8d7110-d3e0-4ee7-918a-d6e4daef330b" containerName="ovn-config" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956691 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ca654c-0a66-443d-97d2-4788f5738c56" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956713 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6ed3df-0417-488a-a4e2-26ab08498a9f" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956731 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb232da-0946-4113-9a7a-1aaea2706f8a" containerName="mariadb-account-create" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.956742 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8d7110-d3e0-4ee7-918a-d6e4daef330b" containerName="ovn-config" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.957473 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.966914 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-59rpw" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.967243 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 06 08:59:43 crc kubenswrapper[4610]: I1006 08:59:43.974820 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tqdxt"] Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.010436 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdmg\" (UniqueName: \"kubernetes.io/projected/ae19c152-10a1-47dd-bb19-c00bf79b56c5-kube-api-access-zrdmg\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.010502 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-config-data\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.010526 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-combined-ca-bundle\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.010576 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-db-sync-config-data\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.112009 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-db-sync-config-data\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.112181 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdmg\" (UniqueName: \"kubernetes.io/projected/ae19c152-10a1-47dd-bb19-c00bf79b56c5-kube-api-access-zrdmg\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.112252 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-config-data\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.112286 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-combined-ca-bundle\") pod 
\"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.119808 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-db-sync-config-data\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.130885 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-combined-ca-bundle\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.131479 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-config-data\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.145791 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdmg\" (UniqueName: \"kubernetes.io/projected/ae19c152-10a1-47dd-bb19-c00bf79b56c5-kube-api-access-zrdmg\") pod \"glance-db-sync-tqdxt\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") " pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.296149 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tqdxt" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.445567 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"764e6cbc-bf6c-4120-9e38-cf70e046dcf8","Type":"ContainerStarted","Data":"c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7"} Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.445827 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.457656 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6hjff-config-rdpkz" event={"ID":"ac8d7110-d3e0-4ee7-918a-d6e4daef330b","Type":"ContainerDied","Data":"be8e322f6832e8986de0395ee852d3d7d8536c10dfe5f63f01bc2142e4a0509d"} Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.457900 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8e322f6832e8986de0395ee852d3d7d8536c10dfe5f63f01bc2142e4a0509d" Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.458080 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6hjff-config-rdpkz"
Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.513674 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371955.341127 podStartE2EDuration="1m21.513648658s" podCreationTimestamp="2025-10-06 08:58:23 +0000 UTC" firstStartedPulling="2025-10-06 08:58:25.387265723 +0000 UTC m=+1037.102319111" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:59:44.503544599 +0000 UTC m=+1116.218597987" watchObservedRunningTime="2025-10-06 08:59:44.513648658 +0000 UTC m=+1116.228702046"
Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.682137 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.845403 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6hjff-config-rdpkz"]
Oct 06 08:59:44 crc kubenswrapper[4610]: I1006 08:59:44.853565 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6hjff-config-rdpkz"]
Oct 06 08:59:45 crc kubenswrapper[4610]: I1006 08:59:45.052771 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tqdxt"]
Oct 06 08:59:45 crc kubenswrapper[4610]: W1006 08:59:45.062398 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae19c152_10a1_47dd_bb19_c00bf79b56c5.slice/crio-46b5e690cddd2b407cdc0580001e7533a2f74788afb083a6eeaec77ea9d511b9 WatchSource:0}: Error finding container 46b5e690cddd2b407cdc0580001e7533a2f74788afb083a6eeaec77ea9d511b9: Status 404 returned error can't find the container with id 46b5e690cddd2b407cdc0580001e7533a2f74788afb083a6eeaec77ea9d511b9
Oct 06 08:59:45 crc kubenswrapper[4610]: I1006 08:59:45.086484 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8d7110-d3e0-4ee7-918a-d6e4daef330b" path="/var/lib/kubelet/pods/ac8d7110-d3e0-4ee7-918a-d6e4daef330b/volumes"
Oct 06 08:59:45 crc kubenswrapper[4610]: I1006 08:59:45.479160 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tqdxt" event={"ID":"ae19c152-10a1-47dd-bb19-c00bf79b56c5","Type":"ContainerStarted","Data":"46b5e690cddd2b407cdc0580001e7533a2f74788afb083a6eeaec77ea9d511b9"}
Oct 06 08:59:45 crc kubenswrapper[4610]: I1006 08:59:45.481830 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"eb2fb15bf716fc43e425b45b6806c87266b5da47f2536a284ec951ec962f6a35"}
Oct 06 08:59:46 crc kubenswrapper[4610]: I1006 08:59:46.469558 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:59:46 crc kubenswrapper[4610]: I1006 08:59:46.469902 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:59:46 crc kubenswrapper[4610]: I1006 08:59:46.489505 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"74bde5afc2ccff9fc01c9b9e692d82af3e9eba6829658cc7a92e9ce674035fe4"}
Oct 06 08:59:47 crc kubenswrapper[4610]: I1006 08:59:47.499930 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"ccc16a378742d41ac8fd42dc3467df7059d5e61184f51c14ecf444b5d8a95f3d"}
Oct 06 08:59:47 crc kubenswrapper[4610]: I1006 08:59:47.499969 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"d25c6e0bc8ead08ea924c070e0641ac508e71cf79ee929d4f175dd709280397a"}
Oct 06 08:59:47 crc kubenswrapper[4610]: I1006 08:59:47.499978 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"b23270b2dec14671a4cab73a1c2b557ce6e838ef1f3b83d87d48005b83150322"}
Oct 06 08:59:49 crc kubenswrapper[4610]: I1006 08:59:49.530335 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"df3e3cd92a705c76ffbecd4975b56ced2f871d4128f10b28e0209ed5d9b15e55"}
Oct 06 08:59:49 crc kubenswrapper[4610]: I1006 08:59:49.530675 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"3716e3e72b696334ab340dcde415a7ddfcea71bf20e5e135cc276c2567df20b9"}
Oct 06 08:59:49 crc kubenswrapper[4610]: I1006 08:59:49.530686 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"0f6bc5ba8035599ed7d6b056364610b103f5e13031cf089ab489debcbcf91841"}
Oct 06 08:59:49 crc kubenswrapper[4610]: I1006 08:59:49.530695 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"7c6d2becb06416e10c7ffae5183a981c21ed921e50fefdc3c8b7553a82225ac7"}
Oct 06 08:59:51 crc kubenswrapper[4610]: I1006 08:59:51.569448 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"a92300560d9844e598441a13712cbe94fbb2e9031f8df974ce43994cd5695781"}
Oct 06 08:59:51 crc kubenswrapper[4610]: I1006 08:59:51.569853 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"743eb2e3cd1d4b49953632fa2cd3b1101d58c0034852ae0cc195b6eb14a6a5df"}
Oct 06 08:59:51 crc kubenswrapper[4610]: I1006 08:59:51.569863 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"2d2d8f4d0eb8c656ff2170d0cd3c5d1033b10a7117d96ccb23e1838880d28e16"}
Oct 06 08:59:51 crc kubenswrapper[4610]: I1006 08:59:51.569871 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"98a8ecd8b5cfeb234a1261e3107be6fd76a8546acee65d55a16c1b9e4318fb41"}
Oct 06 08:59:52 crc kubenswrapper[4610]: I1006 08:59:52.629676 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"6e2c4aad8355473a8070c346926733b57d56e50fbcd60f916c7d60cf3ea43487"}
Oct 06 08:59:52 crc kubenswrapper[4610]: I1006 08:59:52.629968 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"cbefdbe4dfa77729e3d05f9a75f63df7e3e3857fc4456f745309ffd88f136450"}
Oct 06 08:59:54 crc kubenswrapper[4610]: I1006 08:59:54.689307 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.115291 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.146572 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-56r7l"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.153539 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56r7l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.174312 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-56r7l"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.288920 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bx54l"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.289859 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bx54l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.325367 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bx54l"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.330733 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2w6v\" (UniqueName: \"kubernetes.io/projected/8c8040cf-2183-4b30-9a60-9b630ca829ea-kube-api-access-c2w6v\") pod \"cinder-db-create-56r7l\" (UID: \"8c8040cf-2183-4b30-9a60-9b630ca829ea\") " pod="openstack/cinder-db-create-56r7l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.433936 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2w6v\" (UniqueName: \"kubernetes.io/projected/8c8040cf-2183-4b30-9a60-9b630ca829ea-kube-api-access-c2w6v\") pod \"cinder-db-create-56r7l\" (UID: \"8c8040cf-2183-4b30-9a60-9b630ca829ea\") " pod="openstack/cinder-db-create-56r7l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.434076 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257n5\" (UniqueName: \"kubernetes.io/projected/b9d404c7-55ef-4ce8-bb45-1a79923b8209-kube-api-access-257n5\") pod \"barbican-db-create-bx54l\" (UID: \"b9d404c7-55ef-4ce8-bb45-1a79923b8209\") " pod="openstack/barbican-db-create-bx54l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.466028 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2w6v\" (UniqueName: \"kubernetes.io/projected/8c8040cf-2183-4b30-9a60-9b630ca829ea-kube-api-access-c2w6v\") pod \"cinder-db-create-56r7l\" (UID: \"8c8040cf-2183-4b30-9a60-9b630ca829ea\") " pod="openstack/cinder-db-create-56r7l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.497404 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56r7l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.537070 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257n5\" (UniqueName: \"kubernetes.io/projected/b9d404c7-55ef-4ce8-bb45-1a79923b8209-kube-api-access-257n5\") pod \"barbican-db-create-bx54l\" (UID: \"b9d404c7-55ef-4ce8-bb45-1a79923b8209\") " pod="openstack/barbican-db-create-bx54l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.568774 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257n5\" (UniqueName: \"kubernetes.io/projected/b9d404c7-55ef-4ce8-bb45-1a79923b8209-kube-api-access-257n5\") pod \"barbican-db-create-bx54l\" (UID: \"b9d404c7-55ef-4ce8-bb45-1a79923b8209\") " pod="openstack/barbican-db-create-bx54l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.579281 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jkgbb"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.580478 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jkgbb"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.620008 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bx54l"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.622794 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jkgbb"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.726018 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-695gv"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.726995 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.736534 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.736780 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-md8m4"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.736836 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.738031 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.740249 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8vh9\" (UniqueName: \"kubernetes.io/projected/4039566a-c25d-4fad-b328-11b75d88c287-kube-api-access-v8vh9\") pod \"neutron-db-create-jkgbb\" (UID: \"4039566a-c25d-4fad-b328-11b75d88c287\") " pod="openstack/neutron-db-create-jkgbb"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.753311 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-695gv"]
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.841904 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-config-data\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.841947 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-combined-ca-bundle\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.841997 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8vh9\" (UniqueName: \"kubernetes.io/projected/4039566a-c25d-4fad-b328-11b75d88c287-kube-api-access-v8vh9\") pod \"neutron-db-create-jkgbb\" (UID: \"4039566a-c25d-4fad-b328-11b75d88c287\") " pod="openstack/neutron-db-create-jkgbb"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.842070 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzht\" (UniqueName: \"kubernetes.io/projected/84909105-862a-45a9-b78f-35406a385fa7-kube-api-access-qfzht\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.888715 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8vh9\" (UniqueName: \"kubernetes.io/projected/4039566a-c25d-4fad-b328-11b75d88c287-kube-api-access-v8vh9\") pod \"neutron-db-create-jkgbb\" (UID: \"4039566a-c25d-4fad-b328-11b75d88c287\") " pod="openstack/neutron-db-create-jkgbb"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.922311 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jkgbb"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.943437 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-combined-ca-bundle\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.943553 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzht\" (UniqueName: \"kubernetes.io/projected/84909105-862a-45a9-b78f-35406a385fa7-kube-api-access-qfzht\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.943601 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-config-data\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.949270 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-config-data\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.964608 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-combined-ca-bundle\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:55 crc kubenswrapper[4610]: I1006 08:59:55.986574 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzht\" (UniqueName: \"kubernetes.io/projected/84909105-862a-45a9-b78f-35406a385fa7-kube-api-access-qfzht\") pod \"keystone-db-sync-695gv\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") " pod="openstack/keystone-db-sync-695gv"
Oct 06 08:59:56 crc kubenswrapper[4610]: I1006 08:59:56.045990 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-695gv"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.152033 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"]
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.153910 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.158755 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.159341 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.173955 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"]
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.214090 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkgvm\" (UniqueName: \"kubernetes.io/projected/85d61c79-991d-4310-be2a-577b04033e43-kube-api-access-jkgvm\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.214184 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85d61c79-991d-4310-be2a-577b04033e43-config-volume\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.214213 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85d61c79-991d-4310-be2a-577b04033e43-secret-volume\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.318948 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkgvm\" (UniqueName: \"kubernetes.io/projected/85d61c79-991d-4310-be2a-577b04033e43-kube-api-access-jkgvm\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.319055 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85d61c79-991d-4310-be2a-577b04033e43-config-volume\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.319082 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85d61c79-991d-4310-be2a-577b04033e43-secret-volume\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.320613 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85d61c79-991d-4310-be2a-577b04033e43-config-volume\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.326788 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85d61c79-991d-4310-be2a-577b04033e43-secret-volume\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.350031 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkgvm\" (UniqueName: \"kubernetes.io/projected/85d61c79-991d-4310-be2a-577b04033e43-kube-api-access-jkgvm\") pod \"collect-profiles-29329020-cv4m2\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:00 crc kubenswrapper[4610]: I1006 09:00:00.479966 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.353165 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jkgbb"]
Oct 06 09:00:02 crc kubenswrapper[4610]: W1006 09:00:02.369303 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4039566a_c25d_4fad_b328_11b75d88c287.slice/crio-1536c6b8bcd56d968733511c9888859f8305af5cb91c3fa6a14cd7ea536b4cf9 WatchSource:0}: Error finding container 1536c6b8bcd56d968733511c9888859f8305af5cb91c3fa6a14cd7ea536b4cf9: Status 404 returned error can't find the container with id 1536c6b8bcd56d968733511c9888859f8305af5cb91c3fa6a14cd7ea536b4cf9
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.423913 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bx54l"]
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.478377 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"]
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.483482 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-695gv"]
Oct 06 09:00:02 crc kubenswrapper[4610]: W1006 09:00:02.499161 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d404c7_55ef_4ce8_bb45_1a79923b8209.slice/crio-252d7b359ea7266dba2536840f577ec444d1ad3dde9921f204f03c30b65b37be WatchSource:0}: Error finding container 252d7b359ea7266dba2536840f577ec444d1ad3dde9921f204f03c30b65b37be: Status 404 returned error can't find the container with id 252d7b359ea7266dba2536840f577ec444d1ad3dde9921f204f03c30b65b37be
Oct 06 09:00:02 crc kubenswrapper[4610]: W1006 09:00:02.506273 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84909105_862a_45a9_b78f_35406a385fa7.slice/crio-ab36b443396dcb4ca958f62a18adaa114de9ea9fc064ef8284506ccf5a5b736f WatchSource:0}: Error finding container ab36b443396dcb4ca958f62a18adaa114de9ea9fc064ef8284506ccf5a5b736f: Status 404 returned error can't find the container with id ab36b443396dcb4ca958f62a18adaa114de9ea9fc064ef8284506ccf5a5b736f
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.608330 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-56r7l"]
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.716864 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jkgbb" event={"ID":"4039566a-c25d-4fad-b328-11b75d88c287","Type":"ContainerStarted","Data":"1536c6b8bcd56d968733511c9888859f8305af5cb91c3fa6a14cd7ea536b4cf9"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.721740 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"05c553c8-ced7-4296-b8c5-12b91a953b1d","Type":"ContainerStarted","Data":"522ef0329359eeb2b7af39e00f088982b5174091c767aa74221d2374c3dc10ac"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.726594 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tqdxt" event={"ID":"ae19c152-10a1-47dd-bb19-c00bf79b56c5","Type":"ContainerStarted","Data":"ba1719a5f9c891597a29cdef91a60d65f74c69861626d3ef8a5507a01c7b27b0"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.734978 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56r7l" event={"ID":"8c8040cf-2183-4b30-9a60-9b630ca829ea","Type":"ContainerStarted","Data":"cd53ad7500c118aca13654900e5e3a3b9bc8b203ff64bfd0f463d217db0a1b7c"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.737027 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-695gv" event={"ID":"84909105-862a-45a9-b78f-35406a385fa7","Type":"ContainerStarted","Data":"ab36b443396dcb4ca958f62a18adaa114de9ea9fc064ef8284506ccf5a5b736f"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.737990 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2" event={"ID":"85d61c79-991d-4310-be2a-577b04033e43","Type":"ContainerStarted","Data":"cf87711a5cd1a05da1b4ec0cc75d270733549f9f9b1752b12daf61912f82254f"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.740615 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bx54l" event={"ID":"b9d404c7-55ef-4ce8-bb45-1a79923b8209","Type":"ContainerStarted","Data":"252d7b359ea7266dba2536840f577ec444d1ad3dde9921f204f03c30b65b37be"}
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.760075 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=46.778104629 podStartE2EDuration="52.760056278s" podCreationTimestamp="2025-10-06 08:59:10 +0000 UTC" firstStartedPulling="2025-10-06 08:59:44.687695641 +0000 UTC m=+1116.402749019" lastFinishedPulling="2025-10-06 08:59:50.66964727 +0000 UTC m=+1122.384700668" observedRunningTime="2025-10-06 09:00:02.747998068 +0000 UTC m=+1134.463051466" watchObservedRunningTime="2025-10-06 09:00:02.760056278 +0000 UTC m=+1134.475109676"
Oct 06 09:00:02 crc kubenswrapper[4610]: I1006 09:00:02.774873 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tqdxt" podStartSLOduration=2.985495635 podStartE2EDuration="19.774851031s" podCreationTimestamp="2025-10-06 08:59:43 +0000 UTC" firstStartedPulling="2025-10-06 08:59:45.06862508 +0000 UTC m=+1116.783678468" lastFinishedPulling="2025-10-06 09:00:01.857980476 +0000 UTC m=+1133.573033864" observedRunningTime="2025-10-06 09:00:02.769640363 +0000 UTC m=+1134.484693751" watchObservedRunningTime="2025-10-06 09:00:02.774851031 +0000 UTC m=+1134.489904419"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.063776 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-jk27d"]
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.065384 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.066965 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.098247 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-jk27d"]
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.190461 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.192112 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhllq\" (UniqueName: \"kubernetes.io/projected/6158176e-15d4-416a-aa54-74680f7d7ecf-kube-api-access-hhllq\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.192333 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.192502 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-config\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.192642 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.192813 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.294732 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.295649 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.295768 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.295880 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.296024 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.296190 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhllq\" (UniqueName: \"kubernetes.io/projected/6158176e-15d4-416a-aa54-74680f7d7ecf-kube-api-access-hhllq\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.296585 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.296703 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-config\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.296631 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.297279 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.297318 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-config\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.319077 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhllq\" (UniqueName: \"kubernetes.io/projected/6158176e-15d4-416a-aa54-74680f7d7ecf-kube-api-access-hhllq\") pod \"dnsmasq-dns-6d5b6d6b67-jk27d\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.476495 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.752994 4610 generic.go:334] "Generic (PLEG): container finished" podID="85d61c79-991d-4310-be2a-577b04033e43" containerID="a72c8fd83747734d80662ad6ce586ffba07a37f7a62f43ce39d6f09c91ce4974" exitCode=0
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.753317 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2" event={"ID":"85d61c79-991d-4310-be2a-577b04033e43","Type":"ContainerDied","Data":"a72c8fd83747734d80662ad6ce586ffba07a37f7a62f43ce39d6f09c91ce4974"}
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.754888 4610 generic.go:334] "Generic (PLEG): container finished" podID="b9d404c7-55ef-4ce8-bb45-1a79923b8209" containerID="6fd837b2ddd9e5f3595d8947e6c3bb3f7b2c576f04794118ccc1b4482ae2b424" exitCode=0
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.755023 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bx54l" event={"ID":"b9d404c7-55ef-4ce8-bb45-1a79923b8209","Type":"ContainerDied","Data":"6fd837b2ddd9e5f3595d8947e6c3bb3f7b2c576f04794118ccc1b4482ae2b424"}
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.763177 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jkgbb" event={"ID":"4039566a-c25d-4fad-b328-11b75d88c287","Type":"ContainerDied","Data":"c20c5d6fc9be25993dde0e7cd30eeb92b71a0f467af479b17860c0cae2374f5a"}
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.763351 4610 generic.go:334] "Generic (PLEG): container finished" podID="4039566a-c25d-4fad-b328-11b75d88c287" containerID="c20c5d6fc9be25993dde0e7cd30eeb92b71a0f467af479b17860c0cae2374f5a" exitCode=0
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.781466 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c8040cf-2183-4b30-9a60-9b630ca829ea" containerID="aa4806e25b18e158c02f3180d6b58763ef94b014e53186e02713445e5d98a60a" exitCode=0
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.781753 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56r7l" event={"ID":"8c8040cf-2183-4b30-9a60-9b630ca829ea","Type":"ContainerDied","Data":"aa4806e25b18e158c02f3180d6b58763ef94b014e53186e02713445e5d98a60a"}
Oct 06 09:00:03 crc kubenswrapper[4610]: I1006 09:00:03.923428 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-jk27d"]
Oct 06 09:00:04 crc kubenswrapper[4610]: I1006 09:00:04.793013 4610 generic.go:334] "Generic (PLEG): container finished" podID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerID="398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284" exitCode=0
Oct 06 09:00:04 crc kubenswrapper[4610]: I1006 09:00:04.793492 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" event={"ID":"6158176e-15d4-416a-aa54-74680f7d7ecf","Type":"ContainerDied","Data":"398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284"}
Oct 06 09:00:04 crc kubenswrapper[4610]: I1006 09:00:04.793522 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" event={"ID":"6158176e-15d4-416a-aa54-74680f7d7ecf","Type":"ContainerStarted","Data":"fe8433e940f4243474d2958194ce894004d1cfb1ae1ecd86d9ba2fdc8a7906a9"}
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.219518 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.337725 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85d61c79-991d-4310-be2a-577b04033e43-config-volume\") pod \"85d61c79-991d-4310-be2a-577b04033e43\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") "
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.337799 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85d61c79-991d-4310-be2a-577b04033e43-secret-volume\") pod \"85d61c79-991d-4310-be2a-577b04033e43\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") "
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.337977 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkgvm\" (UniqueName: \"kubernetes.io/projected/85d61c79-991d-4310-be2a-577b04033e43-kube-api-access-jkgvm\") pod \"85d61c79-991d-4310-be2a-577b04033e43\" (UID: \"85d61c79-991d-4310-be2a-577b04033e43\") "
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.340373 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85d61c79-991d-4310-be2a-577b04033e43-config-volume" (OuterVolumeSpecName: "config-volume") pod "85d61c79-991d-4310-be2a-577b04033e43" (UID: "85d61c79-991d-4310-be2a-577b04033e43"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.343943 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d61c79-991d-4310-be2a-577b04033e43-kube-api-access-jkgvm" (OuterVolumeSpecName: "kube-api-access-jkgvm") pod "85d61c79-991d-4310-be2a-577b04033e43" (UID: "85d61c79-991d-4310-be2a-577b04033e43"). InnerVolumeSpecName "kube-api-access-jkgvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.344791 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d61c79-991d-4310-be2a-577b04033e43-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85d61c79-991d-4310-be2a-577b04033e43" (UID: "85d61c79-991d-4310-be2a-577b04033e43"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.352398 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bx54l"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.369241 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jkgbb"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.376886 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56r7l"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.439137 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2w6v\" (UniqueName: \"kubernetes.io/projected/8c8040cf-2183-4b30-9a60-9b630ca829ea-kube-api-access-c2w6v\") pod \"8c8040cf-2183-4b30-9a60-9b630ca829ea\" (UID: \"8c8040cf-2183-4b30-9a60-9b630ca829ea\") "
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.439246 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257n5\" (UniqueName: \"kubernetes.io/projected/b9d404c7-55ef-4ce8-bb45-1a79923b8209-kube-api-access-257n5\") pod \"b9d404c7-55ef-4ce8-bb45-1a79923b8209\" (UID: \"b9d404c7-55ef-4ce8-bb45-1a79923b8209\") "
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.439277 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8vh9\" (UniqueName: \"kubernetes.io/projected/4039566a-c25d-4fad-b328-11b75d88c287-kube-api-access-v8vh9\") pod \"4039566a-c25d-4fad-b328-11b75d88c287\" (UID: \"4039566a-c25d-4fad-b328-11b75d88c287\") "
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.439571 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkgvm\" (UniqueName: \"kubernetes.io/projected/85d61c79-991d-4310-be2a-577b04033e43-kube-api-access-jkgvm\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.439582 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85d61c79-991d-4310-be2a-577b04033e43-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.439610 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85d61c79-991d-4310-be2a-577b04033e43-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.443341 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4039566a-c25d-4fad-b328-11b75d88c287-kube-api-access-v8vh9" (OuterVolumeSpecName: "kube-api-access-v8vh9") pod "4039566a-c25d-4fad-b328-11b75d88c287" (UID: "4039566a-c25d-4fad-b328-11b75d88c287"). InnerVolumeSpecName "kube-api-access-v8vh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.443421 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8040cf-2183-4b30-9a60-9b630ca829ea-kube-api-access-c2w6v" (OuterVolumeSpecName: "kube-api-access-c2w6v") pod "8c8040cf-2183-4b30-9a60-9b630ca829ea" (UID: "8c8040cf-2183-4b30-9a60-9b630ca829ea"). InnerVolumeSpecName "kube-api-access-c2w6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.443789 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d404c7-55ef-4ce8-bb45-1a79923b8209-kube-api-access-257n5" (OuterVolumeSpecName: "kube-api-access-257n5") pod "b9d404c7-55ef-4ce8-bb45-1a79923b8209" (UID: "b9d404c7-55ef-4ce8-bb45-1a79923b8209"). InnerVolumeSpecName "kube-api-access-257n5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.541394 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2w6v\" (UniqueName: \"kubernetes.io/projected/8c8040cf-2183-4b30-9a60-9b630ca829ea-kube-api-access-c2w6v\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.541421 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257n5\" (UniqueName: \"kubernetes.io/projected/b9d404c7-55ef-4ce8-bb45-1a79923b8209-kube-api-access-257n5\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.541430 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8vh9\" (UniqueName: \"kubernetes.io/projected/4039566a-c25d-4fad-b328-11b75d88c287-kube-api-access-v8vh9\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.807055 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56r7l" event={"ID":"8c8040cf-2183-4b30-9a60-9b630ca829ea","Type":"ContainerDied","Data":"cd53ad7500c118aca13654900e5e3a3b9bc8b203ff64bfd0f463d217db0a1b7c"}
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.807097 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd53ad7500c118aca13654900e5e3a3b9bc8b203ff64bfd0f463d217db0a1b7c"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.807111 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56r7l"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.809938 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2" event={"ID":"85d61c79-991d-4310-be2a-577b04033e43","Type":"ContainerDied","Data":"cf87711a5cd1a05da1b4ec0cc75d270733549f9f9b1752b12daf61912f82254f"}
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.809970 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf87711a5cd1a05da1b4ec0cc75d270733549f9f9b1752b12daf61912f82254f"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.810008 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.812323 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bx54l"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.812299 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bx54l" event={"ID":"b9d404c7-55ef-4ce8-bb45-1a79923b8209","Type":"ContainerDied","Data":"252d7b359ea7266dba2536840f577ec444d1ad3dde9921f204f03c30b65b37be"}
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.812425 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="252d7b359ea7266dba2536840f577ec444d1ad3dde9921f204f03c30b65b37be"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.813915 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jkgbb" event={"ID":"4039566a-c25d-4fad-b328-11b75d88c287","Type":"ContainerDied","Data":"1536c6b8bcd56d968733511c9888859f8305af5cb91c3fa6a14cd7ea536b4cf9"}
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.813942 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1536c6b8bcd56d968733511c9888859f8305af5cb91c3fa6a14cd7ea536b4cf9"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.813941 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jkgbb"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.815674 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" event={"ID":"6158176e-15d4-416a-aa54-74680f7d7ecf","Type":"ContainerStarted","Data":"4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b"}
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.816785 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:05 crc kubenswrapper[4610]: I1006 09:00:05.840639 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" podStartSLOduration=2.840617807 podStartE2EDuration="2.840617807s" podCreationTimestamp="2025-10-06 09:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:05.834859064 +0000 UTC m=+1137.549912452" watchObservedRunningTime="2025-10-06 09:00:05.840617807 +0000 UTC m=+1137.555671205"
Oct 06 09:00:08 crc kubenswrapper[4610]: I1006 09:00:08.845510 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-695gv" event={"ID":"84909105-862a-45a9-b78f-35406a385fa7","Type":"ContainerStarted","Data":"7b9110bbd9f6c2daeaf12f8cde894c577ac67e6f1bab0874600c481ea5e6a446"}
Oct 06 09:00:08 crc kubenswrapper[4610]: I1006 09:00:08.863074 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-695gv" podStartSLOduration=8.233497095 podStartE2EDuration="13.863039222s" podCreationTimestamp="2025-10-06 08:59:55 +0000 UTC" firstStartedPulling="2025-10-06 09:00:02.534470866 +0000 UTC m=+1134.249524254" lastFinishedPulling="2025-10-06 09:00:08.164012983 +0000 UTC m=+1139.879066381" observedRunningTime="2025-10-06 09:00:08.861818799 +0000 UTC m=+1140.576872217" watchObservedRunningTime="2025-10-06 09:00:08.863039222 +0000 UTC m=+1140.578092610"
Oct 06 09:00:10 crc kubenswrapper[4610]: I1006 09:00:10.862034 4610 generic.go:334] "Generic (PLEG): container finished" podID="ae19c152-10a1-47dd-bb19-c00bf79b56c5" containerID="ba1719a5f9c891597a29cdef91a60d65f74c69861626d3ef8a5507a01c7b27b0" exitCode=0
Oct 06 09:00:10 crc kubenswrapper[4610]: I1006 09:00:10.862108 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tqdxt" event={"ID":"ae19c152-10a1-47dd-bb19-c00bf79b56c5","Type":"ContainerDied","Data":"ba1719a5f9c891597a29cdef91a60d65f74c69861626d3ef8a5507a01c7b27b0"}
Oct 06 09:00:11 crc kubenswrapper[4610]: I1006 09:00:11.871924 4610 generic.go:334] "Generic (PLEG): container finished" podID="84909105-862a-45a9-b78f-35406a385fa7" containerID="7b9110bbd9f6c2daeaf12f8cde894c577ac67e6f1bab0874600c481ea5e6a446" exitCode=0
Oct 06 09:00:11 crc kubenswrapper[4610]: I1006 09:00:11.872024 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-695gv" event={"ID":"84909105-862a-45a9-b78f-35406a385fa7","Type":"ContainerDied","Data":"7b9110bbd9f6c2daeaf12f8cde894c577ac67e6f1bab0874600c481ea5e6a446"}
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.249150 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tqdxt"
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.355776 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrdmg\" (UniqueName: \"kubernetes.io/projected/ae19c152-10a1-47dd-bb19-c00bf79b56c5-kube-api-access-zrdmg\") pod \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") "
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.355965 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-config-data\") pod \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") "
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.355990 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-combined-ca-bundle\") pod \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") "
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.356103 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-db-sync-config-data\") pod \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\" (UID: \"ae19c152-10a1-47dd-bb19-c00bf79b56c5\") "
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.363259 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae19c152-10a1-47dd-bb19-c00bf79b56c5-kube-api-access-zrdmg" (OuterVolumeSpecName: "kube-api-access-zrdmg") pod "ae19c152-10a1-47dd-bb19-c00bf79b56c5" (UID: "ae19c152-10a1-47dd-bb19-c00bf79b56c5"). InnerVolumeSpecName "kube-api-access-zrdmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.363377 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ae19c152-10a1-47dd-bb19-c00bf79b56c5" (UID: "ae19c152-10a1-47dd-bb19-c00bf79b56c5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.385514 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae19c152-10a1-47dd-bb19-c00bf79b56c5" (UID: "ae19c152-10a1-47dd-bb19-c00bf79b56c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.403759 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-config-data" (OuterVolumeSpecName: "config-data") pod "ae19c152-10a1-47dd-bb19-c00bf79b56c5" (UID: "ae19c152-10a1-47dd-bb19-c00bf79b56c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.458148 4610 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.458199 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrdmg\" (UniqueName: \"kubernetes.io/projected/ae19c152-10a1-47dd-bb19-c00bf79b56c5-kube-api-access-zrdmg\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.458210 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.458221 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae19c152-10a1-47dd-bb19-c00bf79b56c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.885597 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tqdxt" event={"ID":"ae19c152-10a1-47dd-bb19-c00bf79b56c5","Type":"ContainerDied","Data":"46b5e690cddd2b407cdc0580001e7533a2f74788afb083a6eeaec77ea9d511b9"}
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.885623 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tqdxt"
Oct 06 09:00:12 crc kubenswrapper[4610]: I1006 09:00:12.885651 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b5e690cddd2b407cdc0580001e7533a2f74788afb083a6eeaec77ea9d511b9"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.375510 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-jk27d"]
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.380476 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="dnsmasq-dns" containerID="cri-o://4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b" gracePeriod=10
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.383262 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.400813 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-695gv"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.445617 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-w42gj"]
Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.445961 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8040cf-2183-4b30-9a60-9b630ca829ea" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.445976 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8040cf-2183-4b30-9a60-9b630ca829ea" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.445986 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d61c79-991d-4310-be2a-577b04033e43" containerName="collect-profiles"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.445993 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d61c79-991d-4310-be2a-577b04033e43" containerName="collect-profiles"
Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.446012 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d404c7-55ef-4ce8-bb45-1a79923b8209" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446019 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d404c7-55ef-4ce8-bb45-1a79923b8209" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.446032 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae19c152-10a1-47dd-bb19-c00bf79b56c5" containerName="glance-db-sync"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446038 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae19c152-10a1-47dd-bb19-c00bf79b56c5" containerName="glance-db-sync"
Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.446077 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84909105-862a-45a9-b78f-35406a385fa7" containerName="keystone-db-sync"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446083 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="84909105-862a-45a9-b78f-35406a385fa7" containerName="keystone-db-sync"
Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.446096 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4039566a-c25d-4fad-b328-11b75d88c287" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446102 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4039566a-c25d-4fad-b328-11b75d88c287" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446240 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4039566a-c25d-4fad-b328-11b75d88c287" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446252 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d404c7-55ef-4ce8-bb45-1a79923b8209" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446270 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae19c152-10a1-47dd-bb19-c00bf79b56c5" containerName="glance-db-sync"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446279 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d61c79-991d-4310-be2a-577b04033e43" containerName="collect-profiles"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446287 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="84909105-862a-45a9-b78f-35406a385fa7" containerName="keystone-db-sync"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.446299 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8040cf-2183-4b30-9a60-9b630ca829ea" containerName="mariadb-database-create"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.447067 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.473797 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-w42gj"]
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.479202 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481149 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzht\" (UniqueName: \"kubernetes.io/projected/84909105-862a-45a9-b78f-35406a385fa7-kube-api-access-qfzht\") pod \"84909105-862a-45a9-b78f-35406a385fa7\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") "
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481277 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-config-data\") pod \"84909105-862a-45a9-b78f-35406a385fa7\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") "
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481332 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-combined-ca-bundle\") pod \"84909105-862a-45a9-b78f-35406a385fa7\" (UID: \"84909105-862a-45a9-b78f-35406a385fa7\") "
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481541 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-svc\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481606 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481625 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-config\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481659 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5qp\" (UniqueName: \"kubernetes.io/projected/7ab21033-7412-4ece-9531-66a65db3f0ab-kube-api-access-kj5qp\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481694 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.481722 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.504764 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84909105-862a-45a9-b78f-35406a385fa7-kube-api-access-qfzht" (OuterVolumeSpecName: "kube-api-access-qfzht") pod "84909105-862a-45a9-b78f-35406a385fa7" (UID: "84909105-862a-45a9-b78f-35406a385fa7"). InnerVolumeSpecName "kube-api-access-qfzht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.588609 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5qp\" (UniqueName: \"kubernetes.io/projected/7ab21033-7412-4ece-9531-66a65db3f0ab-kube-api-access-kj5qp\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.588695 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.588758 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.588862 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-svc\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.588943 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj"
Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.589367 4610 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-config\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.589853 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.589926 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-svc\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.589968 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84909105-862a-45a9-b78f-35406a385fa7" (UID: "84909105-862a-45a9-b78f-35406a385fa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.589987 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzht\" (UniqueName: \"kubernetes.io/projected/84909105-862a-45a9-b78f-35406a385fa7-kube-api-access-qfzht\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.589988 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.590244 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-config\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.591939 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.606477 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj5qp\" (UniqueName: \"kubernetes.io/projected/7ab21033-7412-4ece-9531-66a65db3f0ab-kube-api-access-kj5qp\") pod \"dnsmasq-dns-895cf5cf-w42gj\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.624776 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-config-data" (OuterVolumeSpecName: "config-data") pod "84909105-862a-45a9-b78f-35406a385fa7" (UID: 
"84909105-862a-45a9-b78f-35406a385fa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.691956 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.691984 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84909105-862a-45a9-b78f-35406a385fa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.774411 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.824653 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.903813 4610 generic.go:334] "Generic (PLEG): container finished" podID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerID="4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b" exitCode=0 Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.903877 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" event={"ID":"6158176e-15d4-416a-aa54-74680f7d7ecf","Type":"ContainerDied","Data":"4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b"} Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.903904 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" event={"ID":"6158176e-15d4-416a-aa54-74680f7d7ecf","Type":"ContainerDied","Data":"fe8433e940f4243474d2958194ce894004d1cfb1ae1ecd86d9ba2fdc8a7906a9"} Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.903919 4610 scope.go:117] "RemoveContainer" containerID="4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.904032 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-jk27d" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.913488 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-695gv" event={"ID":"84909105-862a-45a9-b78f-35406a385fa7","Type":"ContainerDied","Data":"ab36b443396dcb4ca958f62a18adaa114de9ea9fc064ef8284506ccf5a5b736f"} Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.913528 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab36b443396dcb4ca958f62a18adaa114de9ea9fc064ef8284506ccf5a5b736f" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.913605 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-695gv" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.928993 4610 scope.go:117] "RemoveContainer" containerID="398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.994657 4610 scope.go:117] "RemoveContainer" containerID="4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b" Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.995393 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b\": container with ID starting with 4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b not found: ID does not exist" containerID="4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.995443 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b"} err="failed to get container status \"4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b\": rpc error: code = NotFound desc = could not find container \"4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b\": container with ID starting with 4afb77ad6d251ad75ead6dd5d7fab2dc83ce80bb24a2661db77bec9369e6837b not found: ID does not exist" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.995481 4610 scope.go:117] "RemoveContainer" containerID="398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284" Oct 06 09:00:13 crc kubenswrapper[4610]: E1006 09:00:13.995908 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284\": container with ID starting with 398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284 not found: ID does not exist" containerID="398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.995937 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284"} err="failed to get container status \"398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284\": rpc error: code = NotFound desc = could not find container \"398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284\": container with ID starting with 398e04feb80371e3dc4e8d5efa4a5f3e39b705456eeb9fea2ad8f78376785284 not found: ID does not exist" Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.997620 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-svc\") pod \"6158176e-15d4-416a-aa54-74680f7d7ecf\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.997759 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-swift-storage-0\") pod \"6158176e-15d4-416a-aa54-74680f7d7ecf\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " Oct 06 09:00:13 crc kubenswrapper[4610]: I1006 09:00:13.997792 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-sb\") pod \"6158176e-15d4-416a-aa54-74680f7d7ecf\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:13.998530 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhllq\" (UniqueName: \"kubernetes.io/projected/6158176e-15d4-416a-aa54-74680f7d7ecf-kube-api-access-hhllq\") pod \"6158176e-15d4-416a-aa54-74680f7d7ecf\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:13.998589 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-config\") pod \"6158176e-15d4-416a-aa54-74680f7d7ecf\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:13.998677 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-nb\") pod \"6158176e-15d4-416a-aa54-74680f7d7ecf\" (UID: \"6158176e-15d4-416a-aa54-74680f7d7ecf\") " Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.007701 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6158176e-15d4-416a-aa54-74680f7d7ecf-kube-api-access-hhllq" (OuterVolumeSpecName: "kube-api-access-hhllq") pod "6158176e-15d4-416a-aa54-74680f7d7ecf" (UID: "6158176e-15d4-416a-aa54-74680f7d7ecf"). InnerVolumeSpecName "kube-api-access-hhllq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.102866 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6158176e-15d4-416a-aa54-74680f7d7ecf" (UID: "6158176e-15d4-416a-aa54-74680f7d7ecf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.103280 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-config" (OuterVolumeSpecName: "config") pod "6158176e-15d4-416a-aa54-74680f7d7ecf" (UID: "6158176e-15d4-416a-aa54-74680f7d7ecf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.110109 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.110140 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhllq\" (UniqueName: \"kubernetes.io/projected/6158176e-15d4-416a-aa54-74680f7d7ecf-kube-api-access-hhllq\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.110152 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.116442 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6158176e-15d4-416a-aa54-74680f7d7ecf" (UID: "6158176e-15d4-416a-aa54-74680f7d7ecf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.125396 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-w42gj"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.141002 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6158176e-15d4-416a-aa54-74680f7d7ecf" (UID: "6158176e-15d4-416a-aa54-74680f7d7ecf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.142579 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6158176e-15d4-416a-aa54-74680f7d7ecf" (UID: "6158176e-15d4-416a-aa54-74680f7d7ecf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.151238 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rf2js"] Oct 06 09:00:14 crc kubenswrapper[4610]: E1006 09:00:14.151557 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="init" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.151568 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="init" Oct 06 09:00:14 crc kubenswrapper[4610]: E1006 09:00:14.151598 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="dnsmasq-dns" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.151603 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="dnsmasq-dns" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.151753 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" containerName="dnsmasq-dns" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.152279 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.168890 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-md8m4" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.169130 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.169218 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.169310 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.172354 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rf2js"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.178961 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-w42gj"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.211674 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-scripts\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.211780 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-fernet-keys\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.211807 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vtcj\" (UniqueName: \"kubernetes.io/projected/ac78470f-702c-4fa4-a521-2deddbdb6e51-kube-api-access-9vtcj\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc 
kubenswrapper[4610]: I1006 09:00:14.211887 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-credential-keys\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.211937 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-combined-ca-bundle\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.212000 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-config-data\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.212660 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.213102 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.213170 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6158176e-15d4-416a-aa54-74680f7d7ecf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.245719 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-p7dzc"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.247294 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.315525 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-credential-keys\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.315590 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-combined-ca-bundle\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.315647 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-config-data\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.315688 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-scripts\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.315712 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-fernet-keys\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.315744 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vtcj\" (UniqueName: \"kubernetes.io/projected/ac78470f-702c-4fa4-a521-2deddbdb6e51-kube-api-access-9vtcj\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.343207 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-config-data\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.353738 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-fernet-keys\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.375474 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vtcj\" (UniqueName: \"kubernetes.io/projected/ac78470f-702c-4fa4-a521-2deddbdb6e51-kube-api-access-9vtcj\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.375739 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-scripts\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.388226 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-combined-ca-bundle\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.388822 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-credential-keys\") pod \"keystone-bootstrap-rf2js\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.398232 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-jk27d"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.423200 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-jk27d"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.426111 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnlb\" (UniqueName: \"kubernetes.io/projected/260e5e02-82c5-4420-99b7-903beec62a86-kube-api-access-5vnlb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.426168 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.426213 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-config\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.426233 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.426293 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.426324 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.452752 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-p7dzc"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.507280 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.513538 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.524183 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.525397 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cd9db456f-6rdvm"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.526879 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.531715 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.532033 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jsnzd" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.533309 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.533383 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.533486 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vnlb\" (UniqueName: \"kubernetes.io/projected/260e5e02-82c5-4420-99b7-903beec62a86-kube-api-access-5vnlb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.533538 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.533576 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-config\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.533626 
4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.534886 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.535203 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.535420 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.535469 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.536326 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-config\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.536435 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.546060 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.549245 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.557560 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.569559 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.576008 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vnlb\" (UniqueName: \"kubernetes.io/projected/260e5e02-82c5-4420-99b7-903beec62a86-kube-api-access-5vnlb\") pod \"dnsmasq-dns-6c9c9f998c-p7dzc\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.585677 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-5cd9db456f-6rdvm"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.644508 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645346 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-config-data\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645372 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-config-data\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645412 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/430ee76b-f17b-4059-bb2e-5f87cf6016d6-horizon-secret-key\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645489 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645510 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-scripts\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645536 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwk7\" (UniqueName: \"kubernetes.io/projected/430ee76b-f17b-4059-bb2e-5f87cf6016d6-kube-api-access-tdwk7\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645557 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-scripts\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645588 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645613 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-log-httpd\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645645 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-run-httpd\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645670 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzgcf\" (UniqueName: \"kubernetes.io/projected/cef8c381-61e9-4b18-abe5-d657d9885979-kube-api-access-gzgcf\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.645688 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ee76b-f17b-4059-bb2e-5f87cf6016d6-logs\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752290 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/430ee76b-f17b-4059-bb2e-5f87cf6016d6-horizon-secret-key\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752496 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752571 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-scripts\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752661 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwk7\" (UniqueName: \"kubernetes.io/projected/430ee76b-f17b-4059-bb2e-5f87cf6016d6-kube-api-access-tdwk7\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752730 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-scripts\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752855 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc 
kubenswrapper[4610]: I1006 09:00:14.752941 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-log-httpd\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.752984 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zx98b"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.753103 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-run-httpd\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.753174 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzgcf\" (UniqueName: \"kubernetes.io/projected/cef8c381-61e9-4b18-abe5-d657d9885979-kube-api-access-gzgcf\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.753251 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ee76b-f17b-4059-bb2e-5f87cf6016d6-logs\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.753392 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-config-data\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.753465 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-config-data\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.754062 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.754821 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-config-data\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.767026 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-scripts\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.767616 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-log-httpd\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.767854 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-run-httpd\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.768167 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ee76b-f17b-4059-bb2e-5f87cf6016d6-logs\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.770688 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.772275 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/430ee76b-f17b-4059-bb2e-5f87cf6016d6-horizon-secret-key\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.773823 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-config-data\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.780424 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.780630 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-scripts\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.780834 4610 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-placement-dockercfg-qlrf5" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.781036 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.782865 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.825396 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zx98b"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.833979 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzgcf\" (UniqueName: \"kubernetes.io/projected/cef8c381-61e9-4b18-abe5-d657d9885979-kube-api-access-gzgcf\") pod \"ceilometer-0\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " pod="openstack/ceilometer-0" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.834370 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwk7\" (UniqueName: \"kubernetes.io/projected/430ee76b-f17b-4059-bb2e-5f87cf6016d6-kube-api-access-tdwk7\") pod \"horizon-5cd9db456f-6rdvm\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.837479 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-p7dzc"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.865622 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-lrn2q"] Oct 06 09:00:14 crc kubenswrapper[4610]: I1006 09:00:14.866893 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.931349 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-lrn2q"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.958416 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.971926 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-scripts\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.971971 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nw6\" (UniqueName: \"kubernetes.io/projected/c5571882-ca5c-46a2-9598-377e7c779036-kube-api-access-48nw6\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972006 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-config\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972021 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972041 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972070 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ba2911-ba6a-40d2-b05e-011016c788c4-logs\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972087 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-combined-ca-bundle\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972113 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdrh\" (UniqueName: \"kubernetes.io/projected/72ba2911-ba6a-40d2-b05e-011016c788c4-kube-api-access-rcdrh\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972151 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-sb\") pod 
\"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972168 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-config-data\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.972192 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:14.992847 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.008540 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" podUID="7ab21033-7412-4ece-9531-66a65db3f0ab" containerName="init" containerID="cri-o://3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609" gracePeriod=10 Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.009480 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.017294 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" event={"ID":"7ab21033-7412-4ece-9531-66a65db3f0ab","Type":"ContainerStarted","Data":"3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609"} Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.017343 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" event={"ID":"7ab21033-7412-4ece-9531-66a65db3f0ab","Type":"ContainerStarted","Data":"96009af340dac556b2b35cd953243e7c6f161b7fd08b94cad26f48515307fc24"} Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.017424 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.049684 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.049886 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-59rpw" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.049989 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.061109 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86bb58785f-lmpd2"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.062970 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087078 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdrh\" (UniqueName: \"kubernetes.io/projected/72ba2911-ba6a-40d2-b05e-011016c788c4-kube-api-access-rcdrh\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087153 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087181 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-config-data\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087227 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087287 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-scripts\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087317 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48nw6\" (UniqueName: \"kubernetes.io/projected/c5571882-ca5c-46a2-9598-377e7c779036-kube-api-access-48nw6\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087357 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-config\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087377 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087402 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc 
kubenswrapper[4610]: I1006 09:00:15.087426 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ba2911-ba6a-40d2-b05e-011016c788c4-logs\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.087446 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-combined-ca-bundle\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.089189 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.095222 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.101338 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.114200 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.114915 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ba2911-ba6a-40d2-b05e-011016c788c4-logs\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.118639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-config\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.149393 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-scripts\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.159339 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nw6\" (UniqueName: 
\"kubernetes.io/projected/c5571882-ca5c-46a2-9598-377e7c779036-kube-api-access-48nw6\") pod \"dnsmasq-dns-57c957c4ff-lrn2q\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188307 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-scripts\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188349 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188372 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188388 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpwdp\" (UniqueName: \"kubernetes.io/projected/8dff6a2f-b759-443f-8c8f-50af38096244-kube-api-access-gpwdp\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188451 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188481 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188524 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188548 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjr7g\" (UniqueName: \"kubernetes.io/projected/6a1df847-3ff6-4009-9707-a5cacd51d067-kube-api-access-sjr7g\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188580 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dff6a2f-b759-443f-8c8f-50af38096244-horizon-secret-key\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188594 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188615 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dff6a2f-b759-443f-8c8f-50af38096244-logs\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.188632 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-config-data\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.204265 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdrh\" (UniqueName: \"kubernetes.io/projected/72ba2911-ba6a-40d2-b05e-011016c788c4-kube-api-access-rcdrh\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.215387 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-config-data\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.215813 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-combined-ca-bundle\") pod \"placement-db-sync-zx98b\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295068 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295106 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjr7g\" (UniqueName: \"kubernetes.io/projected/6a1df847-3ff6-4009-9707-a5cacd51d067-kube-api-access-sjr7g\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295170 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/8dff6a2f-b759-443f-8c8f-50af38096244-horizon-secret-key\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295186 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295207 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dff6a2f-b759-443f-8c8f-50af38096244-logs\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295224 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-config-data\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295246 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-scripts\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295266 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295282 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295299 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpwdp\" (UniqueName: \"kubernetes.io/projected/8dff6a2f-b759-443f-8c8f-50af38096244-kube-api-access-gpwdp\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295350 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295378 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.295721 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.296842 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-config-data\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.297052 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-scripts\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.298807 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.302918 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dff6a2f-b759-443f-8c8f-50af38096244-logs\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.303192 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.324581 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.326683 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.327430 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dff6a2f-b759-443f-8c8f-50af38096244-horizon-secret-key\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.332223 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.335642 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpwdp\" (UniqueName: \"kubernetes.io/projected/8dff6a2f-b759-443f-8c8f-50af38096244-kube-api-access-gpwdp\") pod \"horizon-86bb58785f-lmpd2\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.350302 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjr7g\" (UniqueName: \"kubernetes.io/projected/6a1df847-3ff6-4009-9707-a5cacd51d067-kube-api-access-sjr7g\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.392545 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.397065 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zx98b" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.424661 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6158176e-15d4-416a-aa54-74680f7d7ecf" path="/var/lib/kubelet/pods/6158176e-15d4-416a-aa54-74680f7d7ecf/volumes" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.425490 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.425518 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86bb58785f-lmpd2"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.425533 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.435110 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.435148 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cdf4-account-create-hfhbl"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.435437 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.440027 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cdf4-account-create-hfhbl"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.440070 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ac63-account-create-4btw5"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.440556 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.441068 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.442690 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ac63-account-create-4btw5"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.442871 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.447516 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.478429 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.583157 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.583957 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.599940 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626621 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hwq\" (UniqueName: \"kubernetes.io/projected/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-kube-api-access-28hwq\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626706 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626743 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncc8x\" (UniqueName: \"kubernetes.io/projected/45050b7d-e181-4561-91ae-4ba8897b9daf-kube-api-access-ncc8x\") pod \"barbican-cdf4-account-create-hfhbl\" (UID: \"45050b7d-e181-4561-91ae-4ba8897b9daf\") " pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626771 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626796 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626906 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6757x\" 
(UniqueName: \"kubernetes.io/projected/a181b0d5-7688-4cde-be34-8b7108abf09b-kube-api-access-6757x\") pod \"cinder-ac63-account-create-4btw5\" (UID: \"a181b0d5-7688-4cde-be34-8b7108abf09b\") " pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626920 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-config-data\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626961 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-scripts\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.626980 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-logs\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.726267 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e32f-account-create-489xl"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728534 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728602 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncc8x\" (UniqueName: \"kubernetes.io/projected/45050b7d-e181-4561-91ae-4ba8897b9daf-kube-api-access-ncc8x\") pod \"barbican-cdf4-account-create-hfhbl\" (UID: \"45050b7d-e181-4561-91ae-4ba8897b9daf\") " pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728635 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728671 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728696 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728749 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6757x\" (UniqueName: \"kubernetes.io/projected/a181b0d5-7688-4cde-be34-8b7108abf09b-kube-api-access-6757x\") pod \"cinder-ac63-account-create-4btw5\" (UID: \"a181b0d5-7688-4cde-be34-8b7108abf09b\") " pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728790 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-config-data\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728830 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-scripts\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728853 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-logs\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.728896 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28hwq\" (UniqueName: \"kubernetes.io/projected/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-kube-api-access-28hwq\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.730526 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.733780 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-logs\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.741317 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.742234 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.749326 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e32f-account-create-489xl"] Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.754189 
4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-scripts\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.786293 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.807789 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6757x\" (UniqueName: \"kubernetes.io/projected/a181b0d5-7688-4cde-be34-8b7108abf09b-kube-api-access-6757x\") pod \"cinder-ac63-account-create-4btw5\" (UID: \"a181b0d5-7688-4cde-be34-8b7108abf09b\") " pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.808413 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncc8x\" (UniqueName: \"kubernetes.io/projected/45050b7d-e181-4561-91ae-4ba8897b9daf-kube-api-access-ncc8x\") pod \"barbican-cdf4-account-create-hfhbl\" (UID: \"45050b7d-e181-4561-91ae-4ba8897b9daf\") " pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.822812 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hwq\" (UniqueName: \"kubernetes.io/projected/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-kube-api-access-28hwq\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.832944 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qz7p\" (UniqueName: \"kubernetes.io/projected/b1274a89-0f59-489d-a02f-d08cc9513c2d-kube-api-access-7qz7p\") pod \"neutron-e32f-account-create-489xl\" (UID: \"b1274a89-0f59-489d-a02f-d08cc9513c2d\") " pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.845172 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-config-data\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.928802 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.948353 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qz7p\" (UniqueName: \"kubernetes.io/projected/b1274a89-0f59-489d-a02f-d08cc9513c2d-kube-api-access-7qz7p\") pod \"neutron-e32f-account-create-489xl\" (UID: \"b1274a89-0f59-489d-a02f-d08cc9513c2d\") " pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.975550 4610 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:15 crc kubenswrapper[4610]: E1006 09:00:15.983927 4610 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab21033_7412_4ece_9531_66a65db3f0ab.slice/crio-conmon-3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab21033_7412_4ece_9531_66a65db3f0ab.slice/crio-3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609.scope\": RecentStats: unable to find data in memory cache]" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.987516 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qz7p\" (UniqueName: \"kubernetes.io/projected/b1274a89-0f59-489d-a02f-d08cc9513c2d-kube-api-access-7qz7p\") pod \"neutron-e32f-account-create-489xl\" (UID: \"b1274a89-0f59-489d-a02f-d08cc9513c2d\") " pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:15 crc kubenswrapper[4610]: I1006 09:00:15.997764 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.011924 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rf2js"] Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.015474 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.096307 4610 generic.go:334] "Generic (PLEG): container finished" podID="7ab21033-7412-4ece-9531-66a65db3f0ab" containerID="3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609" exitCode=0 Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.096367 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" event={"ID":"7ab21033-7412-4ece-9531-66a65db3f0ab","Type":"ContainerDied","Data":"3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609"} Oct 06 09:00:16 crc kubenswrapper[4610]: W1006 09:00:16.096475 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac78470f_702c_4fa4_a521_2deddbdb6e51.slice/crio-5ebad10d4a0ba2ca2c74c1585f1bef4f3ed602dd3102c5bae5b4bdb705040029 WatchSource:0}: Error finding container 5ebad10d4a0ba2ca2c74c1585f1bef4f3ed602dd3102c5bae5b4bdb705040029: Status 404 returned error can't find the container with id 5ebad10d4a0ba2ca2c74c1585f1bef4f3ed602dd3102c5bae5b4bdb705040029 Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.211757 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.258219 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.368833 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.371496 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-sb\") pod \"7ab21033-7412-4ece-9531-66a65db3f0ab\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.371544 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-config\") pod \"7ab21033-7412-4ece-9531-66a65db3f0ab\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.371634 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-svc\") pod \"7ab21033-7412-4ece-9531-66a65db3f0ab\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.371733 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-nb\") pod \"7ab21033-7412-4ece-9531-66a65db3f0ab\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.371788 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj5qp\" (UniqueName: \"kubernetes.io/projected/7ab21033-7412-4ece-9531-66a65db3f0ab-kube-api-access-kj5qp\") pod \"7ab21033-7412-4ece-9531-66a65db3f0ab\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.371828 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-swift-storage-0\") pod \"7ab21033-7412-4ece-9531-66a65db3f0ab\" (UID: \"7ab21033-7412-4ece-9531-66a65db3f0ab\") " Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.397148 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab21033-7412-4ece-9531-66a65db3f0ab-kube-api-access-kj5qp" (OuterVolumeSpecName: "kube-api-access-kj5qp") pod "7ab21033-7412-4ece-9531-66a65db3f0ab" (UID: "7ab21033-7412-4ece-9531-66a65db3f0ab"). InnerVolumeSpecName "kube-api-access-kj5qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.401271 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-p7dzc"] Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.401520 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj5qp\" (UniqueName: \"kubernetes.io/projected/7ab21033-7412-4ece-9531-66a65db3f0ab-kube-api-access-kj5qp\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:16 crc kubenswrapper[4610]: W1006 09:00:16.408251 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcef8c381_61e9_4b18_abe5_d657d9885979.slice/crio-4c0a443b48154b7bd5dc05c09dd3544c0519842d515a8704dc59bc88031cd6c1 WatchSource:0}: Error finding container 4c0a443b48154b7bd5dc05c09dd3544c0519842d515a8704dc59bc88031cd6c1: Status 404 returned error can't find the container with id 4c0a443b48154b7bd5dc05c09dd3544c0519842d515a8704dc59bc88031cd6c1 Oct 06 09:00:16 crc kubenswrapper[4610]: W1006 09:00:16.408660 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod260e5e02_82c5_4420_99b7_903beec62a86.slice/crio-e0427aef5df8bb0a9a580e99f62850e6aaacc31084f6bf27d4ea5fceaa88f083 WatchSource:0}: Error finding container e0427aef5df8bb0a9a580e99f62850e6aaacc31084f6bf27d4ea5fceaa88f083: Status 404 returned error can't find the container with id e0427aef5df8bb0a9a580e99f62850e6aaacc31084f6bf27d4ea5fceaa88f083 Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.430730 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-config" (OuterVolumeSpecName: "config") pod "7ab21033-7412-4ece-9531-66a65db3f0ab" (UID: "7ab21033-7412-4ece-9531-66a65db3f0ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.469977 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.470295 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.509591 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.537683 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ab21033-7412-4ece-9531-66a65db3f0ab" (UID: "7ab21033-7412-4ece-9531-66a65db3f0ab"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.557413 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ab21033-7412-4ece-9531-66a65db3f0ab" (UID: "7ab21033-7412-4ece-9531-66a65db3f0ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.561422 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ab21033-7412-4ece-9531-66a65db3f0ab" (UID: "7ab21033-7412-4ece-9531-66a65db3f0ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.566206 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ab21033-7412-4ece-9531-66a65db3f0ab" (UID: "7ab21033-7412-4ece-9531-66a65db3f0ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.618104 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.618144 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.618157 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.618168 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ab21033-7412-4ece-9531-66a65db3f0ab-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.873856 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86bb58785f-lmpd2"] Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.895425 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-lrn2q"] Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.961248 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zx98b"] Oct 06 09:00:16 crc kubenswrapper[4610]: I1006 09:00:16.970727 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cd9db456f-6rdvm"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.158992 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e32f-account-create-489xl"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.167351 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" event={"ID":"7ab21033-7412-4ece-9531-66a65db3f0ab","Type":"ContainerDied","Data":"96009af340dac556b2b35cd953243e7c6f161b7fd08b94cad26f48515307fc24"} Oct 
06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.167448 4610 scope.go:117] "RemoveContainer" containerID="3cd644e05afd9735fea68cd68c2e84f7f6b2c729910cbc37e6885003b8904609" Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.178348 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd9db456f-6rdvm" event={"ID":"430ee76b-f17b-4059-bb2e-5f87cf6016d6","Type":"ContainerStarted","Data":"35db6e06f197d26d5a919bc529fb90d3650812d82938d825fe78aa6babcf13dc"} Oct 06 09:00:17 crc kubenswrapper[4610]: W1006 09:00:17.189898 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1df847_3ff6_4009_9707_a5cacd51d067.slice/crio-f3df5597bdd5eed9f3a0cfbdada29e62771f16fe610feafc7f5933ed0671ebfe WatchSource:0}: Error finding container f3df5597bdd5eed9f3a0cfbdada29e62771f16fe610feafc7f5933ed0671ebfe: Status 404 returned error can't find the container with id f3df5597bdd5eed9f3a0cfbdada29e62771f16fe610feafc7f5933ed0671ebfe Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.190277 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerStarted","Data":"4c0a443b48154b7bd5dc05c09dd3544c0519842d515a8704dc59bc88031cd6c1"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.203824 4610 generic.go:334] "Generic (PLEG): container finished" podID="260e5e02-82c5-4420-99b7-903beec62a86" containerID="42362327cf22c170dc105ef63e2e580d28f8cf1608eff7aaf4490b4cb1f9bfef" exitCode=0 Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.203986 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" event={"ID":"260e5e02-82c5-4420-99b7-903beec62a86","Type":"ContainerDied","Data":"42362327cf22c170dc105ef63e2e580d28f8cf1608eff7aaf4490b4cb1f9bfef"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.204081 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" event={"ID":"260e5e02-82c5-4420-99b7-903beec62a86","Type":"ContainerStarted","Data":"e0427aef5df8bb0a9a580e99f62850e6aaacc31084f6bf27d4ea5fceaa88f083"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.207236 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ac63-account-create-4btw5"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.212065 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-w42gj" Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.257506 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86bb58785f-lmpd2" event={"ID":"8dff6a2f-b759-443f-8c8f-50af38096244","Type":"ContainerStarted","Data":"e2330802b23936adb07bd059da3efa1e1382259fefde1e0e7ba966becd70eac5"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.276144 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zx98b" event={"ID":"72ba2911-ba6a-40d2-b05e-011016c788c4","Type":"ContainerStarted","Data":"41849fc2fa734b46341e322265a4ee05e7d9354e30a45f7ea0cc264f54603bea"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.286538 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" event={"ID":"c5571882-ca5c-46a2-9598-377e7c779036","Type":"ContainerStarted","Data":"26395376bb0ec542afde54f1d520810d881dea2483e19863b19a776dea64162e"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.290497 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rf2js" event={"ID":"ac78470f-702c-4fa4-a521-2deddbdb6e51","Type":"ContainerStarted","Data":"882160670d83072988d19d3d783db490deed5d5939efd6625ed8a513ef1389ba"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.290542 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rf2js" event={"ID":"ac78470f-702c-4fa4-a521-2deddbdb6e51","Type":"ContainerStarted","Data":"5ebad10d4a0ba2ca2c74c1585f1bef4f3ed602dd3102c5bae5b4bdb705040029"} Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.356646 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.450995 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.507146 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cdf4-account-create-hfhbl"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.525600 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-w42gj"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.558951 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-w42gj"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.559522 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rf2js" podStartSLOduration=3.559511357 podStartE2EDuration="3.559511357s" podCreationTimestamp="2025-10-06 09:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:17.377578344 +0000 UTC m=+1149.092631732" watchObservedRunningTime="2025-10-06 09:00:17.559511357 +0000 UTC m=+1149.274564745" Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.829265 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.945783 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cd9db456f-6rdvm"] Oct 06 09:00:17 crc kubenswrapper[4610]: I1006 09:00:17.979487 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.003078 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f7968cb5c-gnj87"] Oct 06 09:00:18 crc kubenswrapper[4610]: E1006 09:00:18.003476 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260e5e02-82c5-4420-99b7-903beec62a86" containerName="init" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.003493 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="260e5e02-82c5-4420-99b7-903beec62a86" containerName="init" Oct 06 09:00:18 crc kubenswrapper[4610]: E1006 09:00:18.003505 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab21033-7412-4ece-9531-66a65db3f0ab" containerName="init" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.003517 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab21033-7412-4ece-9531-66a65db3f0ab" containerName="init" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.003695 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab21033-7412-4ece-9531-66a65db3f0ab" containerName="init" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.003710 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="260e5e02-82c5-4420-99b7-903beec62a86" containerName="init" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.004738 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.013549 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-nb\") pod \"260e5e02-82c5-4420-99b7-903beec62a86\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.013646 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-svc\") pod \"260e5e02-82c5-4420-99b7-903beec62a86\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.013726 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-config\") pod \"260e5e02-82c5-4420-99b7-903beec62a86\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.013784 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-swift-storage-0\") pod \"260e5e02-82c5-4420-99b7-903beec62a86\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.013854 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vnlb\" (UniqueName: \"kubernetes.io/projected/260e5e02-82c5-4420-99b7-903beec62a86-kube-api-access-5vnlb\") pod \"260e5e02-82c5-4420-99b7-903beec62a86\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.013939 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-sb\") pod \"260e5e02-82c5-4420-99b7-903beec62a86\" (UID: \"260e5e02-82c5-4420-99b7-903beec62a86\") " Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.078449 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/260e5e02-82c5-4420-99b7-903beec62a86-kube-api-access-5vnlb" (OuterVolumeSpecName: "kube-api-access-5vnlb") pod "260e5e02-82c5-4420-99b7-903beec62a86" (UID: "260e5e02-82c5-4420-99b7-903beec62a86"). InnerVolumeSpecName "kube-api-access-5vnlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.103701 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.119101 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-logs\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.119356 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-config-data\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.119451 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htrnr\" (UniqueName: \"kubernetes.io/projected/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-kube-api-access-htrnr\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.119576 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-horizon-secret-key\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.120203 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-scripts\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.120376 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vnlb\" (UniqueName: \"kubernetes.io/projected/260e5e02-82c5-4420-99b7-903beec62a86-kube-api-access-5vnlb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.127878 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f7968cb5c-gnj87"] Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.146904 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "260e5e02-82c5-4420-99b7-903beec62a86" (UID: 
"260e5e02-82c5-4420-99b7-903beec62a86"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.156160 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "260e5e02-82c5-4420-99b7-903beec62a86" (UID: "260e5e02-82c5-4420-99b7-903beec62a86"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.163305 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "260e5e02-82c5-4420-99b7-903beec62a86" (UID: "260e5e02-82c5-4420-99b7-903beec62a86"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.181009 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.194433 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "260e5e02-82c5-4420-99b7-903beec62a86" (UID: "260e5e02-82c5-4420-99b7-903beec62a86"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.194594 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-config" (OuterVolumeSpecName: "config") pod "260e5e02-82c5-4420-99b7-903beec62a86" (UID: "260e5e02-82c5-4420-99b7-903beec62a86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.226752 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-scripts\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.226834 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-logs\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.226910 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-config-data\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.226946 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htrnr\" (UniqueName: \"kubernetes.io/projected/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-kube-api-access-htrnr\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.227015 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-horizon-secret-key\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.227594 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-logs\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.227951 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-scripts\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.229037 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-config-data\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.230510 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.230527 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.230536 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.230546 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.230554 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/260e5e02-82c5-4420-99b7-903beec62a86-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.236540 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-horizon-secret-key\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.266911 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htrnr\" (UniqueName: \"kubernetes.io/projected/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-kube-api-access-htrnr\") pod \"horizon-6f7968cb5c-gnj87\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.381687 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.386612 4610 generic.go:334] "Generic (PLEG): container finished" podID="c5571882-ca5c-46a2-9598-377e7c779036" containerID="197833cb4fe91c18b70378d742c893f770918d4a189d37611bc9c02c2daa2ba2" exitCode=0 Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.386696 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" event={"ID":"c5571882-ca5c-46a2-9598-377e7c779036","Type":"ContainerDied","Data":"197833cb4fe91c18b70378d742c893f770918d4a189d37611bc9c02c2daa2ba2"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.419488 4610 generic.go:334] "Generic (PLEG): container finished" podID="b1274a89-0f59-489d-a02f-d08cc9513c2d" containerID="cd989300cbe2b04dfc331e821ba340e62171907ab6709eecb44068e4ea383c98" exitCode=0 Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.419576 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e32f-account-create-489xl" event={"ID":"b1274a89-0f59-489d-a02f-d08cc9513c2d","Type":"ContainerDied","Data":"cd989300cbe2b04dfc331e821ba340e62171907ab6709eecb44068e4ea383c98"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.419603 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e32f-account-create-489xl" event={"ID":"b1274a89-0f59-489d-a02f-d08cc9513c2d","Type":"ContainerStarted","Data":"c0ee02fbfb9110dc76942f2f61eec5e7d529642f33bd5e23200231547b6f4ccf"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.438579 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab","Type":"ContainerStarted","Data":"3c2351031f77ef230c4dddf19a7b0952e2a8dcb676e662784ea6f761579f9e87"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 
09:00:18.440578 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a1df847-3ff6-4009-9707-a5cacd51d067","Type":"ContainerStarted","Data":"f3df5597bdd5eed9f3a0cfbdada29e62771f16fe610feafc7f5933ed0671ebfe"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.448934 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ac63-account-create-4btw5" event={"ID":"a181b0d5-7688-4cde-be34-8b7108abf09b","Type":"ContainerStarted","Data":"5c48930a935627d2b67b8cc77df60f8cf3936c3af19624399f4d0c1c514d206d"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.448996 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ac63-account-create-4btw5" event={"ID":"a181b0d5-7688-4cde-be34-8b7108abf09b","Type":"ContainerStarted","Data":"612c2beec2d5dbac5014f46abaa692419cd52acd7408c34584812b3bc58869e6"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.458866 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" event={"ID":"260e5e02-82c5-4420-99b7-903beec62a86","Type":"ContainerDied","Data":"e0427aef5df8bb0a9a580e99f62850e6aaacc31084f6bf27d4ea5fceaa88f083"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.458919 4610 scope.go:117] "RemoveContainer" containerID="42362327cf22c170dc105ef63e2e580d28f8cf1608eff7aaf4490b4cb1f9bfef" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.459012 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-p7dzc" Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.470139 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cdf4-account-create-hfhbl" event={"ID":"45050b7d-e181-4561-91ae-4ba8897b9daf","Type":"ContainerStarted","Data":"8fc11d620c32dc677ba2489f9a9dec3111fa353c90450c26a863454a5386aada"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.470175 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cdf4-account-create-hfhbl" event={"ID":"45050b7d-e181-4561-91ae-4ba8897b9daf","Type":"ContainerStarted","Data":"1e6cb16b5681bcdb6fa991ffb2618b3182013f1722e87f39042a26411a60c374"} Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.760888 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-p7dzc"] Oct 06 09:00:18 crc kubenswrapper[4610]: I1006 09:00:18.767883 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-p7dzc"] Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.148209 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="260e5e02-82c5-4420-99b7-903beec62a86" path="/var/lib/kubelet/pods/260e5e02-82c5-4420-99b7-903beec62a86/volumes" Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.149506 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab21033-7412-4ece-9531-66a65db3f0ab" path="/var/lib/kubelet/pods/7ab21033-7412-4ece-9531-66a65db3f0ab/volumes" Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.159221 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f7968cb5c-gnj87"] Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.493173 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" event={"ID":"c5571882-ca5c-46a2-9598-377e7c779036","Type":"ContainerStarted","Data":"c3d7947c6c4d4523abf33e6ffa5955cea85b7a462b1750f1eed9df7592f2c7e1"} Oct 06 09:00:19 crc 
kubenswrapper[4610]: I1006 09:00:19.494859 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.517842 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f7968cb5c-gnj87" event={"ID":"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e","Type":"ContainerStarted","Data":"52fb2d9e9088dabde142b4325c07e771887fa2a9ffbe9e34cb031c77fdc020c6"} Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.529533 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" podStartSLOduration=5.529517476 podStartE2EDuration="5.529517476s" podCreationTimestamp="2025-10-06 09:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:19.527224245 +0000 UTC m=+1151.242277643" watchObservedRunningTime="2025-10-06 09:00:19.529517476 +0000 UTC m=+1151.244570864" Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.531959 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab","Type":"ContainerStarted","Data":"f68bc1aa2035aad5d0a37b4d94eca0d99f95e1272057d867be13a66ea9a108f2"} Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.539161 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a1df847-3ff6-4009-9707-a5cacd51d067","Type":"ContainerStarted","Data":"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14"} Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.541287 4610 generic.go:334] "Generic (PLEG): container finished" podID="a181b0d5-7688-4cde-be34-8b7108abf09b" containerID="5c48930a935627d2b67b8cc77df60f8cf3936c3af19624399f4d0c1c514d206d" exitCode=0 Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.541331 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ac63-account-create-4btw5" event={"ID":"a181b0d5-7688-4cde-be34-8b7108abf09b","Type":"ContainerDied","Data":"5c48930a935627d2b67b8cc77df60f8cf3936c3af19624399f4d0c1c514d206d"} Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.552125 4610 generic.go:334] "Generic (PLEG): container finished" podID="45050b7d-e181-4561-91ae-4ba8897b9daf" containerID="8fc11d620c32dc677ba2489f9a9dec3111fa353c90450c26a863454a5386aada" exitCode=0 Oct 06 09:00:19 crc kubenswrapper[4610]: I1006 09:00:19.552542 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cdf4-account-create-hfhbl" event={"ID":"45050b7d-e181-4561-91ae-4ba8897b9daf","Type":"ContainerDied","Data":"8fc11d620c32dc677ba2489f9a9dec3111fa353c90450c26a863454a5386aada"} Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.038623 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.085831 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6757x\" (UniqueName: \"kubernetes.io/projected/a181b0d5-7688-4cde-be34-8b7108abf09b-kube-api-access-6757x\") pod \"a181b0d5-7688-4cde-be34-8b7108abf09b\" (UID: \"a181b0d5-7688-4cde-be34-8b7108abf09b\") " Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.096234 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a181b0d5-7688-4cde-be34-8b7108abf09b-kube-api-access-6757x" (OuterVolumeSpecName: "kube-api-access-6757x") pod "a181b0d5-7688-4cde-be34-8b7108abf09b" (UID: "a181b0d5-7688-4cde-be34-8b7108abf09b"). InnerVolumeSpecName "kube-api-access-6757x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.103282 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.187680 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.189568 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qz7p\" (UniqueName: \"kubernetes.io/projected/b1274a89-0f59-489d-a02f-d08cc9513c2d-kube-api-access-7qz7p\") pod \"b1274a89-0f59-489d-a02f-d08cc9513c2d\" (UID: \"b1274a89-0f59-489d-a02f-d08cc9513c2d\") " Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.190521 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6757x\" (UniqueName: \"kubernetes.io/projected/a181b0d5-7688-4cde-be34-8b7108abf09b-kube-api-access-6757x\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.206199 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1274a89-0f59-489d-a02f-d08cc9513c2d-kube-api-access-7qz7p" (OuterVolumeSpecName: "kube-api-access-7qz7p") pod "b1274a89-0f59-489d-a02f-d08cc9513c2d" (UID: "b1274a89-0f59-489d-a02f-d08cc9513c2d"). InnerVolumeSpecName "kube-api-access-7qz7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.292028 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncc8x\" (UniqueName: \"kubernetes.io/projected/45050b7d-e181-4561-91ae-4ba8897b9daf-kube-api-access-ncc8x\") pod \"45050b7d-e181-4561-91ae-4ba8897b9daf\" (UID: \"45050b7d-e181-4561-91ae-4ba8897b9daf\") " Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.292425 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qz7p\" (UniqueName: \"kubernetes.io/projected/b1274a89-0f59-489d-a02f-d08cc9513c2d-kube-api-access-7qz7p\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.297582 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45050b7d-e181-4561-91ae-4ba8897b9daf-kube-api-access-ncc8x" (OuterVolumeSpecName: "kube-api-access-ncc8x") pod "45050b7d-e181-4561-91ae-4ba8897b9daf" (UID: "45050b7d-e181-4561-91ae-4ba8897b9daf"). InnerVolumeSpecName "kube-api-access-ncc8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.394458 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncc8x\" (UniqueName: \"kubernetes.io/projected/45050b7d-e181-4561-91ae-4ba8897b9daf-kube-api-access-ncc8x\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.598308 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e32f-account-create-489xl" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.599768 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e32f-account-create-489xl" event={"ID":"b1274a89-0f59-489d-a02f-d08cc9513c2d","Type":"ContainerDied","Data":"c0ee02fbfb9110dc76942f2f61eec5e7d529642f33bd5e23200231547b6f4ccf"} Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.599812 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ee02fbfb9110dc76942f2f61eec5e7d529642f33bd5e23200231547b6f4ccf" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.607930 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a1df847-3ff6-4009-9707-a5cacd51d067","Type":"ContainerStarted","Data":"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5"} Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.608170 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-log" containerID="cri-o://c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14" gracePeriod=30 Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.608667 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-httpd" containerID="cri-o://cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5" gracePeriod=30 Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.616839 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ac63-account-create-4btw5" event={"ID":"a181b0d5-7688-4cde-be34-8b7108abf09b","Type":"ContainerDied","Data":"612c2beec2d5dbac5014f46abaa692419cd52acd7408c34584812b3bc58869e6"} Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.616880 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612c2beec2d5dbac5014f46abaa692419cd52acd7408c34584812b3bc58869e6" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.616959 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ac63-account-create-4btw5" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.633450 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.63343353 podStartE2EDuration="6.63343353s" podCreationTimestamp="2025-10-06 09:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:20.631027616 +0000 UTC m=+1152.346081014" watchObservedRunningTime="2025-10-06 09:00:20.63343353 +0000 UTC m=+1152.348486918" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.650949 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cdf4-account-create-hfhbl" Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.651895 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cdf4-account-create-hfhbl" event={"ID":"45050b7d-e181-4561-91ae-4ba8897b9daf","Type":"ContainerDied","Data":"1e6cb16b5681bcdb6fa991ffb2618b3182013f1722e87f39042a26411a60c374"} Oct 06 09:00:20 crc kubenswrapper[4610]: I1006 09:00:20.651932 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e6cb16b5681bcdb6fa991ffb2618b3182013f1722e87f39042a26411a60c374" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.341953 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420002 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-combined-ca-bundle\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420123 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjr7g\" (UniqueName: \"kubernetes.io/projected/6a1df847-3ff6-4009-9707-a5cacd51d067-kube-api-access-sjr7g\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420181 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-httpd-run\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420291 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-config-data\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420318 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-logs\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420339 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-scripts\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420397 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"6a1df847-3ff6-4009-9707-a5cacd51d067\" (UID: \"6a1df847-3ff6-4009-9707-a5cacd51d067\") " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.420867 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.421194 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-logs" (OuterVolumeSpecName: "logs") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.421283 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.427260 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.428855 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-scripts" (OuterVolumeSpecName: "scripts") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.428986 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1df847-3ff6-4009-9707-a5cacd51d067-kube-api-access-sjr7g" (OuterVolumeSpecName: "kube-api-access-sjr7g") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). InnerVolumeSpecName "kube-api-access-sjr7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.452470 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.482469 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-config-data" (OuterVolumeSpecName: "config-data") pod "6a1df847-3ff6-4009-9707-a5cacd51d067" (UID: "6a1df847-3ff6-4009-9707-a5cacd51d067"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.522476 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.522541 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.522553 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjr7g\" (UniqueName: \"kubernetes.io/projected/6a1df847-3ff6-4009-9707-a5cacd51d067-kube-api-access-sjr7g\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.522563 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.522571 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1df847-3ff6-4009-9707-a5cacd51d067-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.522598 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1df847-3ff6-4009-9707-a5cacd51d067-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.546784 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.624230 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.666347 4610 generic.go:334] "Generic (PLEG): container finished" podID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerID="cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5" exitCode=0 Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.666378 4610 generic.go:334] "Generic (PLEG): container finished" podID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerID="c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14" exitCode=143 Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.666459 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.667265 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a1df847-3ff6-4009-9707-a5cacd51d067","Type":"ContainerDied","Data":"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5"} Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.667372 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a1df847-3ff6-4009-9707-a5cacd51d067","Type":"ContainerDied","Data":"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14"} Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.667414 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a1df847-3ff6-4009-9707-a5cacd51d067","Type":"ContainerDied","Data":"f3df5597bdd5eed9f3a0cfbdada29e62771f16fe610feafc7f5933ed0671ebfe"} Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.667436 4610 scope.go:117] "RemoveContainer" containerID="cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.673732 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab","Type":"ContainerStarted","Data":"0aacb6a9d515466a7c32193f976c085d73f1fabeac5506f7802ba89e4c253601"} Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.675328 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-log" containerID="cri-o://f68bc1aa2035aad5d0a37b4d94eca0d99f95e1272057d867be13a66ea9a108f2" gracePeriod=30 Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.675713 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-httpd" containerID="cri-o://0aacb6a9d515466a7c32193f976c085d73f1fabeac5506f7802ba89e4c253601" gracePeriod=30 Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.701352 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.701337217 podStartE2EDuration="6.701337217s" podCreationTimestamp="2025-10-06 09:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:21.700822723 +0000 UTC m=+1153.415876111" watchObservedRunningTime="2025-10-06 09:00:21.701337217 +0000 UTC m=+1153.416390605" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.741675 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.756547 4610 scope.go:117] "RemoveContainer" containerID="c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.756671 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770119 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.770675 4610 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-httpd" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770692 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-httpd" Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.770711 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a181b0d5-7688-4cde-be34-8b7108abf09b" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770718 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a181b0d5-7688-4cde-be34-8b7108abf09b" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.770731 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45050b7d-e181-4561-91ae-4ba8897b9daf" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770740 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="45050b7d-e181-4561-91ae-4ba8897b9daf" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.770751 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1274a89-0f59-489d-a02f-d08cc9513c2d" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770758 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1274a89-0f59-489d-a02f-d08cc9513c2d" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.770778 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-log" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770785 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-log" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770979 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-log" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.770994 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a181b0d5-7688-4cde-be34-8b7108abf09b" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.771011 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="45050b7d-e181-4561-91ae-4ba8897b9daf" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.771028 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" containerName="glance-httpd" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.771036 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1274a89-0f59-489d-a02f-d08cc9513c2d" containerName="mariadb-account-create" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.772258 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.776223 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.784411 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.832827 4610 scope.go:117] "RemoveContainer" containerID="cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5" Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.835516 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5\": container with ID starting with cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5 not found: ID does not exist" containerID="cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.835558 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5"} err="failed to get container status \"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5\": rpc error: code = NotFound desc = could not find container \"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5\": container with ID starting with cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5 not found: ID does not exist" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.835584 4610 scope.go:117] "RemoveContainer" containerID="c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14" Oct 06 09:00:21 crc kubenswrapper[4610]: E1006 09:00:21.836451 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14\": container with ID starting with c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14 not found: ID does not exist" containerID="c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.836484 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14"} err="failed to get container status \"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14\": rpc error: code = NotFound desc = could not find container \"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14\": container with ID starting with c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14 not found: ID does not exist" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.836539 4610 scope.go:117] "RemoveContainer" containerID="cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.836927 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5"} err="failed to get container status \"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5\": rpc error: code = NotFound desc = could not find container \"cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5\": container with ID 
starting with cc25c1395faf3938f7eb3b49714f4512d0bcc3b9e27a58120df1c59194edd0e5 not found: ID does not exist" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.836951 4610 scope.go:117] "RemoveContainer" containerID="c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.841882 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14"} err="failed to get container status \"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14\": rpc error: code = NotFound desc = could not find container \"c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14\": container with ID starting with c42dfe491763763ab91e58f7ccce6cfb2e354a21227798d07eb3de0d1c64da14 not found: ID does not exist" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.927932 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.928009 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.928039 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnsdt\" (UniqueName: \"kubernetes.io/projected/96d0277d-2925-435b-9b4f-c3d83a605e5d-kube-api-access-pnsdt\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.929044 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-logs\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.929108 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.929147 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:21 crc kubenswrapper[4610]: I1006 09:00:21.929174 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030657 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030715 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030740 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030772 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030815 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030843 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnsdt\" (UniqueName: \"kubernetes.io/projected/96d0277d-2925-435b-9b4f-c3d83a605e5d-kube-api-access-pnsdt\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.030885 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-logs\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.031237 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-logs\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.031905 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.032101 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.041728 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.046493 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.046831 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.059708 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnsdt\" (UniqueName: \"kubernetes.io/projected/96d0277d-2925-435b-9b4f-c3d83a605e5d-kube-api-access-pnsdt\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.086846 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.105435 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.690445 4610 generic.go:334] "Generic (PLEG): container finished" podID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerID="0aacb6a9d515466a7c32193f976c085d73f1fabeac5506f7802ba89e4c253601" exitCode=0 Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.690997 4610 generic.go:334] "Generic (PLEG): container finished" podID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerID="f68bc1aa2035aad5d0a37b4d94eca0d99f95e1272057d867be13a66ea9a108f2" exitCode=143 Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.690551 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab","Type":"ContainerDied","Data":"0aacb6a9d515466a7c32193f976c085d73f1fabeac5506f7802ba89e4c253601"} Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.691346 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab","Type":"ContainerDied","Data":"f68bc1aa2035aad5d0a37b4d94eca0d99f95e1272057d867be13a66ea9a108f2"} Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.695281 4610 generic.go:334] "Generic (PLEG): container finished" podID="ac78470f-702c-4fa4-a521-2deddbdb6e51" containerID="882160670d83072988d19d3d783db490deed5d5939efd6625ed8a513ef1389ba" exitCode=0 Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.695341 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rf2js" event={"ID":"ac78470f-702c-4fa4-a521-2deddbdb6e51","Type":"ContainerDied","Data":"882160670d83072988d19d3d783db490deed5d5939efd6625ed8a513ef1389ba"} Oct 06 09:00:22 crc kubenswrapper[4610]: I1006 09:00:22.842373 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:23 crc kubenswrapper[4610]: I1006 09:00:23.083187 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1df847-3ff6-4009-9707-a5cacd51d067" path="/var/lib/kubelet/pods/6a1df847-3ff6-4009-9707-a5cacd51d067/volumes" Oct 06 09:00:24 crc kubenswrapper[4610]: I1006 09:00:24.930181 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.091394 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-scripts\") pod \"ac78470f-702c-4fa4-a521-2deddbdb6e51\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.091462 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-combined-ca-bundle\") pod \"ac78470f-702c-4fa4-a521-2deddbdb6e51\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.091498 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vtcj\" (UniqueName: \"kubernetes.io/projected/ac78470f-702c-4fa4-a521-2deddbdb6e51-kube-api-access-9vtcj\") pod \"ac78470f-702c-4fa4-a521-2deddbdb6e51\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.091548 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-fernet-keys\") pod \"ac78470f-702c-4fa4-a521-2deddbdb6e51\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.091647 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-credential-keys\") pod \"ac78470f-702c-4fa4-a521-2deddbdb6e51\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.091688 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-config-data\") pod \"ac78470f-702c-4fa4-a521-2deddbdb6e51\" (UID: \"ac78470f-702c-4fa4-a521-2deddbdb6e51\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.102237 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ac78470f-702c-4fa4-a521-2deddbdb6e51" (UID: "ac78470f-702c-4fa4-a521-2deddbdb6e51"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.102780 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ac78470f-702c-4fa4-a521-2deddbdb6e51" (UID: "ac78470f-702c-4fa4-a521-2deddbdb6e51"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.102837 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac78470f-702c-4fa4-a521-2deddbdb6e51-kube-api-access-9vtcj" (OuterVolumeSpecName: "kube-api-access-9vtcj") pod "ac78470f-702c-4fa4-a521-2deddbdb6e51" (UID: "ac78470f-702c-4fa4-a521-2deddbdb6e51"). InnerVolumeSpecName "kube-api-access-9vtcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.104044 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-scripts" (OuterVolumeSpecName: "scripts") pod "ac78470f-702c-4fa4-a521-2deddbdb6e51" (UID: "ac78470f-702c-4fa4-a521-2deddbdb6e51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.143131 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-config-data" (OuterVolumeSpecName: "config-data") pod "ac78470f-702c-4fa4-a521-2deddbdb6e51" (UID: "ac78470f-702c-4fa4-a521-2deddbdb6e51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.180279 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac78470f-702c-4fa4-a521-2deddbdb6e51" (UID: "ac78470f-702c-4fa4-a521-2deddbdb6e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.197244 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.197274 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.197286 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vtcj\" (UniqueName: \"kubernetes.io/projected/ac78470f-702c-4fa4-a521-2deddbdb6e51-kube-api-access-9vtcj\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.197294 4610 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.197306 4610 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.197313 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac78470f-702c-4fa4-a521-2deddbdb6e51-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.287506 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.508692 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.591346 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606057 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-combined-ca-bundle\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606108 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-logs\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606147 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-scripts\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606173 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28hwq\" (UniqueName: \"kubernetes.io/projected/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-kube-api-access-28hwq\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606247 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-config-data\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606477 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.606508 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-httpd-run\") pod \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\" (UID: \"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab\") " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.607982 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-logs" (OuterVolumeSpecName: "logs") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.608211 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.623169 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-scripts" (OuterVolumeSpecName: "scripts") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.625104 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.627909 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-kube-api-access-28hwq" (OuterVolumeSpecName: "kube-api-access-28hwq") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "kube-api-access-28hwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.661641 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-glndl"] Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.661905 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" containerID="cri-o://05cd332531da528c7c4b53c3febbebd690c03cf920eb818592abea5271832907" gracePeriod=10 Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.669653 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.710732 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.710953 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.710962 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.710971 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.710979 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.710986 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28hwq\" (UniqueName: \"kubernetes.io/projected/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-kube-api-access-28hwq\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.715265 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-config-data" (OuterVolumeSpecName: "config-data") pod "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" (UID: "7461aacb-8aa0-4a3d-8832-b60ccb2b26ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.727652 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96d0277d-2925-435b-9b4f-c3d83a605e5d","Type":"ContainerStarted","Data":"23ac11d4202f4d98e48046deb3aa458a811933860c13a884f2bdbe0a69bbd191"} Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.736859 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rf2js" event={"ID":"ac78470f-702c-4fa4-a521-2deddbdb6e51","Type":"ContainerDied","Data":"5ebad10d4a0ba2ca2c74c1585f1bef4f3ed602dd3102c5bae5b4bdb705040029"} Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.736914 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ebad10d4a0ba2ca2c74c1585f1bef4f3ed602dd3102c5bae5b4bdb705040029" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.736894 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rf2js" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.739581 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-29xd8"] Oct 06 09:00:25 crc kubenswrapper[4610]: E1006 09:00:25.739953 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-httpd" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.739976 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-httpd" Oct 06 09:00:25 crc kubenswrapper[4610]: E1006 09:00:25.739995 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-log" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.740003 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-log" Oct 06 09:00:25 crc kubenswrapper[4610]: E1006 09:00:25.740031 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac78470f-702c-4fa4-a521-2deddbdb6e51" containerName="keystone-bootstrap" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.740037 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac78470f-702c-4fa4-a521-2deddbdb6e51" containerName="keystone-bootstrap" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.740194 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac78470f-702c-4fa4-a521-2deddbdb6e51" containerName="keystone-bootstrap" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.740213 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-httpd" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.740229 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" containerName="glance-log" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.740768 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.744406 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f75p4" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.744602 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.757200 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-29xd8"] Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.764353 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7461aacb-8aa0-4a3d-8832-b60ccb2b26ab","Type":"ContainerDied","Data":"3c2351031f77ef230c4dddf19a7b0952e2a8dcb676e662784ea6f761579f9e87"} Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.764401 4610 scope.go:117] "RemoveContainer" containerID="0aacb6a9d515466a7c32193f976c085d73f1fabeac5506f7802ba89e4c253601" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.764561 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.766929 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jflcj"] Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.767933 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.775197 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n464c" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.775115 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.775369 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.776711 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.812811 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.812840 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.813633 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jflcj"] Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.841738 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.913683 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-db-sync-config-data\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914029 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-combined-ca-bundle\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914194 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca74dbf-7969-4a03-a618-83505fc9c7ec-etc-machine-id\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914343 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqfm\" (UniqueName: 
\"kubernetes.io/projected/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-kube-api-access-jsqfm\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914424 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-combined-ca-bundle\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914503 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9cx\" (UniqueName: \"kubernetes.io/projected/2ca74dbf-7969-4a03-a618-83505fc9c7ec-kube-api-access-wm9cx\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914656 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-config-data\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914765 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-db-sync-config-data\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.914835 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-scripts\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:25 crc kubenswrapper[4610]: I1006 09:00:25.993600 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-t9gmg"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:25.994657 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:25.997033 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmxdv" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:25.997227 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:25.997376 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.001806 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t9gmg"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016436 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm9cx\" (UniqueName: \"kubernetes.io/projected/2ca74dbf-7969-4a03-a618-83505fc9c7ec-kube-api-access-wm9cx\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016551 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-config-data\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016603 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-db-sync-config-data\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016626 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-scripts\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016661 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-db-sync-config-data\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016688 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-combined-ca-bundle\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016728 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca74dbf-7969-4a03-a618-83505fc9c7ec-etc-machine-id\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016751 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsqfm\" (UniqueName: 
\"kubernetes.io/projected/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-kube-api-access-jsqfm\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.016772 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-combined-ca-bundle\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.027829 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca74dbf-7969-4a03-a618-83505fc9c7ec-etc-machine-id\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.033830 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-db-sync-config-data\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.043236 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-scripts\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.044933 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-db-sync-config-data\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.049951 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-config-data\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.057183 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-combined-ca-bundle\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.075703 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-combined-ca-bundle\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.083589 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsqfm\" (UniqueName: \"kubernetes.io/projected/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-kube-api-access-jsqfm\") pod \"barbican-db-sync-29xd8\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc 
kubenswrapper[4610]: I1006 09:00:26.084293 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm9cx\" (UniqueName: \"kubernetes.io/projected/2ca74dbf-7969-4a03-a618-83505fc9c7ec-kube-api-access-wm9cx\") pod \"cinder-db-sync-jflcj\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.086899 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rf2js"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.118768 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-config\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.118848 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-combined-ca-bundle\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.118934 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb88d\" (UniqueName: \"kubernetes.io/projected/4408d93d-c733-4032-92fc-df3c6d8d9b0b-kube-api-access-sb88d\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.125013 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rf2js"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.139212 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j2w82"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.141904 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.145367 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-md8m4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.149185 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.149858 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.150079 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.170950 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-29xd8" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.214798 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j2w82"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.217831 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jflcj" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227243 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-combined-ca-bundle\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227342 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-credential-keys\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227443 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-config-data\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227488 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5scr9\" (UniqueName: \"kubernetes.io/projected/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-kube-api-access-5scr9\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227602 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb88d\" (UniqueName: \"kubernetes.io/projected/4408d93d-c733-4032-92fc-df3c6d8d9b0b-kube-api-access-sb88d\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227626 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-scripts\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227642 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-combined-ca-bundle\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227718 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-fernet-keys\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.227752 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-config\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " 
pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.235008 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-config\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.283678 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-combined-ca-bundle\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.290334 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb88d\" (UniqueName: \"kubernetes.io/projected/4408d93d-c733-4032-92fc-df3c6d8d9b0b-kube-api-access-sb88d\") pod \"neutron-db-sync-t9gmg\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.296264 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.317149 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.320908 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.323341 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.325877 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.326014 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.328500 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.328862 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5scr9\" (UniqueName: \"kubernetes.io/projected/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-kube-api-access-5scr9\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.328930 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-scripts\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.329573 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-combined-ca-bundle\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.329641 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-fernet-keys\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.330262 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-credential-keys\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.330701 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-config-data\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.339009 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-scripts\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.339430 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-config-data\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.339581 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-combined-ca-bundle\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.341612 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-fernet-keys\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.349199 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-credential-keys\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.355287 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5scr9\" (UniqueName: \"kubernetes.io/projected/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-kube-api-access-5scr9\") pod \"keystone-bootstrap-j2w82\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: E1006 09:00:26.415174 4610 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7461aacb_8aa0_4a3d_8832_b60ccb2b26ab.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7461aacb_8aa0_4a3d_8832_b60ccb2b26ab.slice/crio-3c2351031f77ef230c4dddf19a7b0952e2a8dcb676e662784ea6f761579f9e87\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac78470f_702c_4fa4_a521_2deddbdb6e51.slice\": RecentStats: unable to find data in memory cache]" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.432894 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-logs\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.432949 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.432992 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-config-data\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.433026 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.433181 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfrc\" (UniqueName: \"kubernetes.io/projected/410fe180-3cd7-4905-9c10-51d51c8e7152-kube-api-access-5kfrc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.433215 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-scripts\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.433247 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.433299 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.517087 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.522603 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.523283 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86bb58785f-lmpd2"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.534757 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-logs\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.535034 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.535273 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-config-data\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.535955 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.536159 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfrc\" (UniqueName: \"kubernetes.io/projected/410fe180-3cd7-4905-9c10-51d51c8e7152-kube-api-access-5kfrc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.536247 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.535559 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-logs\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.535550 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.536259 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-scripts\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.536468 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.536609 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.557887 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-config-data\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.557995 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.558527 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.564118 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8454b778cb-f7b67"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.572971 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-scripts\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.577764 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfrc\" (UniqueName: \"kubernetes.io/projected/410fe180-3cd7-4905-9c10-51d51c8e7152-kube-api-access-5kfrc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.585174 
4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8454b778cb-f7b67"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.585480 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.585635 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.588529 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.638131 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-scripts\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.638471 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-secret-key\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.638628 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-tls-certs\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.638728 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmx6\" (UniqueName: \"kubernetes.io/projected/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-kube-api-access-sdmx6\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.638831 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-logs\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.638960 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-config-data\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.639118 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-combined-ca-bundle\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " 
pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.644159 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f7968cb5c-gnj87"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.653795 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.664911 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.688141 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-868f4bc56b-f2np4"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.693580 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.717908 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868f4bc56b-f2np4"] Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741392 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-scripts\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741462 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-secret-key\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741498 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-tls-certs\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741526 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmx6\" (UniqueName: \"kubernetes.io/projected/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-kube-api-access-sdmx6\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741551 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-logs\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741570 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-config-data\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.741601 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-combined-ca-bundle\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.743001 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-scripts\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.743320 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-logs\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.744694 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-config-data\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.747034 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-combined-ca-bundle\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.748714 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-tls-certs\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.756633 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-secret-key\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.767465 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmx6\" (UniqueName: \"kubernetes.io/projected/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-kube-api-access-sdmx6\") pod \"horizon-8454b778cb-f7b67\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.776135 4610 generic.go:334] "Generic (PLEG): container finished" podID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerID="05cd332531da528c7c4b53c3febbebd690c03cf920eb818592abea5271832907" exitCode=0 Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.776200 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" event={"ID":"9a9f3925-16de-4002-919d-413e1d94a7c0","Type":"ContainerDied","Data":"05cd332531da528c7c4b53c3febbebd690c03cf920eb818592abea5271832907"} Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.842847 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-combined-ca-bundle\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.842907 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0843392c-2df1-4619-9745-21ca7d06a589-config-data\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.842952 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0843392c-2df1-4619-9745-21ca7d06a589-logs\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.842972 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjp7\" (UniqueName: \"kubernetes.io/projected/0843392c-2df1-4619-9745-21ca7d06a589-kube-api-access-wqjp7\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.843029 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-horizon-secret-key\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.843092 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-horizon-tls-certs\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.843114 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0843392c-2df1-4619-9745-21ca7d06a589-scripts\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.913955 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944369 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0843392c-2df1-4619-9745-21ca7d06a589-config-data\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944443 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0843392c-2df1-4619-9745-21ca7d06a589-logs\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944465 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjp7\" (UniqueName: \"kubernetes.io/projected/0843392c-2df1-4619-9745-21ca7d06a589-kube-api-access-wqjp7\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944524 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-horizon-secret-key\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944565 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-horizon-tls-certs\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944589 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0843392c-2df1-4619-9745-21ca7d06a589-scripts\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.944637 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-combined-ca-bundle\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.946112 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0843392c-2df1-4619-9745-21ca7d06a589-scripts\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.946384 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0843392c-2df1-4619-9745-21ca7d06a589-logs\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.947024 4610 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0843392c-2df1-4619-9745-21ca7d06a589-config-data\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.948640 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-combined-ca-bundle\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.955853 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-horizon-secret-key\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.956229 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0843392c-2df1-4619-9745-21ca7d06a589-horizon-tls-certs\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:26 crc kubenswrapper[4610]: I1006 09:00:26.965119 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjp7\" (UniqueName: \"kubernetes.io/projected/0843392c-2df1-4619-9745-21ca7d06a589-kube-api-access-wqjp7\") pod \"horizon-868f4bc56b-f2np4\" (UID: \"0843392c-2df1-4619-9745-21ca7d06a589\") " pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:27 crc kubenswrapper[4610]: I1006 09:00:27.010729 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:00:27 crc kubenswrapper[4610]: I1006 09:00:27.081328 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7461aacb-8aa0-4a3d-8832-b60ccb2b26ab" path="/var/lib/kubelet/pods/7461aacb-8aa0-4a3d-8832-b60ccb2b26ab/volumes" Oct 06 09:00:27 crc kubenswrapper[4610]: I1006 09:00:27.082432 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac78470f-702c-4fa4-a521-2deddbdb6e51" path="/var/lib/kubelet/pods/ac78470f-702c-4fa4-a521-2deddbdb6e51/volumes" Oct 06 09:00:30 crc kubenswrapper[4610]: I1006 09:00:30.840867 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 06 09:00:35 crc kubenswrapper[4610]: I1006 09:00:35.842030 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 06 09:00:35 crc kubenswrapper[4610]: I1006 09:00:35.842650 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 09:00:39 crc kubenswrapper[4610]: E1006 09:00:39.515714 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 06 09:00:39 crc kubenswrapper[4610]: E1006 09:00:39.516158 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647hfbh578hcfhfh58dh94h5ddh575h76h5ch548h55dh55h67fh5b4h5fbh5ddh66bh5f8h5d5h5cfh8bh78h9dh597hdfh58bhc5hdhfbhcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdwk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5cd9db456f-6rdvm_openstack(430ee76b-f17b-4059-bb2e-5f87cf6016d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 09:00:40 crc kubenswrapper[4610]: I1006 09:00:40.841611 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 06 09:00:40 crc kubenswrapper[4610]: E1006 09:00:40.892194 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5cd9db456f-6rdvm" podUID="430ee76b-f17b-4059-bb2e-5f87cf6016d6" Oct 06 09:00:41 crc kubenswrapper[4610]: E1006 09:00:41.882245 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 06 09:00:41 crc kubenswrapper[4610]: E1006 09:00:41.882654 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b5h5dh5f9h696h5fhcch645h699hf9h6h585h649h596h85h668h678h687h5dch658h67dhfdh56fh6ch8fh649h559h645h559h679h59ch6ch58cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-htrnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f7968cb5c-gnj87_openstack(2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 09:00:41 crc kubenswrapper[4610]: I1006 09:00:41.894464 4610 scope.go:117] "RemoveContainer" containerID="f68bc1aa2035aad5d0a37b4d94eca0d99f95e1272057d867be13a66ea9a108f2" Oct 06 09:00:41 crc kubenswrapper[4610]: E1006 09:00:41.894506 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f7968cb5c-gnj87" podUID="2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" Oct 06 09:00:41 crc kubenswrapper[4610]: I1006 09:00:41.974127 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd9db456f-6rdvm" event={"ID":"430ee76b-f17b-4059-bb2e-5f87cf6016d6","Type":"ContainerDied","Data":"35db6e06f197d26d5a919bc529fb90d3650812d82938d825fe78aa6babcf13dc"} Oct 06 09:00:41 crc kubenswrapper[4610]: I1006 09:00:41.974419 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35db6e06f197d26d5a919bc529fb90d3650812d82938d825fe78aa6babcf13dc" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.061709 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" event={"ID":"9a9f3925-16de-4002-919d-413e1d94a7c0","Type":"ContainerDied","Data":"d12d9adde4107d3c8ef001a80ea49b6f84a57fe52d365b1bfbb93a96f07994b1"} Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.061743 4610 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d12d9adde4107d3c8ef001a80ea49b6f84a57fe52d365b1bfbb93a96f07994b1" Oct 06 09:00:42 crc kubenswrapper[4610]: E1006 09:00:42.114387 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 06 09:00:42 crc kubenswrapper[4610]: E1006 09:00:42.114514 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n669hd5h675h58bh5bdhc8h64ch5d8h57h57chd4h54dh695h6dh646h577h595h698h5dbh699h547h56h74h547hf5h664hf5h69h79h565h5d9h576q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gpwdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-86bb58785f-lmpd2_openstack(8dff6a2f-b759-443f-8c8f-50af38096244): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 09:00:42 crc kubenswrapper[4610]: E1006 09:00:42.128695 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-86bb58785f-lmpd2" podUID="8dff6a2f-b759-443f-8c8f-50af38096244" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.176711 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd9db456f-6rdvm" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.176823 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-glndl" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.280640 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-scripts\") pod \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.280878 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-sb\") pod \"9a9f3925-16de-4002-919d-413e1d94a7c0\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.280904 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwk7\" (UniqueName: \"kubernetes.io/projected/430ee76b-f17b-4059-bb2e-5f87cf6016d6-kube-api-access-tdwk7\") pod \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.280948 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrl8l\" (UniqueName: \"kubernetes.io/projected/9a9f3925-16de-4002-919d-413e1d94a7c0-kube-api-access-xrl8l\") pod \"9a9f3925-16de-4002-919d-413e1d94a7c0\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281174 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ee76b-f17b-4059-bb2e-5f87cf6016d6-logs\") pod \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281180 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-scripts" (OuterVolumeSpecName: "scripts") pod "430ee76b-f17b-4059-bb2e-5f87cf6016d6" (UID: "430ee76b-f17b-4059-bb2e-5f87cf6016d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281206 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-config\") pod \"9a9f3925-16de-4002-919d-413e1d94a7c0\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281253 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-dns-svc\") pod \"9a9f3925-16de-4002-919d-413e1d94a7c0\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281282 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-nb\") pod \"9a9f3925-16de-4002-919d-413e1d94a7c0\" (UID: \"9a9f3925-16de-4002-919d-413e1d94a7c0\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281408 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/430ee76b-f17b-4059-bb2e-5f87cf6016d6-horizon-secret-key\") pod \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.281430 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-config-data\") pod \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\" (UID: \"430ee76b-f17b-4059-bb2e-5f87cf6016d6\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.282108 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.282683 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-config-data" (OuterVolumeSpecName: "config-data") pod "430ee76b-f17b-4059-bb2e-5f87cf6016d6" (UID: "430ee76b-f17b-4059-bb2e-5f87cf6016d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.283671 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/430ee76b-f17b-4059-bb2e-5f87cf6016d6-logs" (OuterVolumeSpecName: "logs") pod "430ee76b-f17b-4059-bb2e-5f87cf6016d6" (UID: "430ee76b-f17b-4059-bb2e-5f87cf6016d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.290277 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9f3925-16de-4002-919d-413e1d94a7c0-kube-api-access-xrl8l" (OuterVolumeSpecName: "kube-api-access-xrl8l") pod "9a9f3925-16de-4002-919d-413e1d94a7c0" (UID: "9a9f3925-16de-4002-919d-413e1d94a7c0"). InnerVolumeSpecName "kube-api-access-xrl8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.291211 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430ee76b-f17b-4059-bb2e-5f87cf6016d6-kube-api-access-tdwk7" (OuterVolumeSpecName: "kube-api-access-tdwk7") pod "430ee76b-f17b-4059-bb2e-5f87cf6016d6" (UID: "430ee76b-f17b-4059-bb2e-5f87cf6016d6"). InnerVolumeSpecName "kube-api-access-tdwk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.299320 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430ee76b-f17b-4059-bb2e-5f87cf6016d6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "430ee76b-f17b-4059-bb2e-5f87cf6016d6" (UID: "430ee76b-f17b-4059-bb2e-5f87cf6016d6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.361527 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a9f3925-16de-4002-919d-413e1d94a7c0" (UID: "9a9f3925-16de-4002-919d-413e1d94a7c0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.365526 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-29xd8"] Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.381034 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-config" (OuterVolumeSpecName: "config") pod "9a9f3925-16de-4002-919d-413e1d94a7c0" (UID: "9a9f3925-16de-4002-919d-413e1d94a7c0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: W1006 09:00:42.381122 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32dd0fa5_c2b0_48dc_a81f_bfa7c58ecda5.slice/crio-d3775f9894673dd6073c07ca1f0b32cbf7d361a85ad17d066c63ff4288f050c6 WatchSource:0}: Error finding container d3775f9894673dd6073c07ca1f0b32cbf7d361a85ad17d066c63ff4288f050c6: Status 404 returned error can't find the container with id d3775f9894673dd6073c07ca1f0b32cbf7d361a85ad17d066c63ff4288f050c6 Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383747 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwk7\" (UniqueName: \"kubernetes.io/projected/430ee76b-f17b-4059-bb2e-5f87cf6016d6-kube-api-access-tdwk7\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383764 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrl8l\" (UniqueName: \"kubernetes.io/projected/9a9f3925-16de-4002-919d-413e1d94a7c0-kube-api-access-xrl8l\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383773 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ee76b-f17b-4059-bb2e-5f87cf6016d6-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383783 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383791 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383801 4610 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/430ee76b-f17b-4059-bb2e-5f87cf6016d6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.383812 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/430ee76b-f17b-4059-bb2e-5f87cf6016d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.390269 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a9f3925-16de-4002-919d-413e1d94a7c0" (UID: "9a9f3925-16de-4002-919d-413e1d94a7c0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.433092 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a9f3925-16de-4002-919d-413e1d94a7c0" (UID: "9a9f3925-16de-4002-919d-413e1d94a7c0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.485083 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.485112 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a9f3925-16de-4002-919d-413e1d94a7c0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.596347 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:42 crc kubenswrapper[4610]: W1006 09:00:42.603472 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod410fe180_3cd7_4905_9c10_51d51c8e7152.slice/crio-5657bd1655f35bc304bab97a256e9e296eed9f6f86632c1d0e7752761effab6a WatchSource:0}: Error finding container 5657bd1655f35bc304bab97a256e9e296eed9f6f86632c1d0e7752761effab6a: Status 404 returned error can't find the container with id 5657bd1655f35bc304bab97a256e9e296eed9f6f86632c1d0e7752761effab6a Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.661066 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8454b778cb-f7b67"] Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.678256 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jflcj"] Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.691638 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j2w82"] Oct 06 09:00:42 crc kubenswrapper[4610]: W1006 09:00:42.693437 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d65a6aa_72d2_4b32_b19f_b76c50c13bc8.slice/crio-9ebfae9f811874072cb34e003d2e85e547644c1b8e0b125c9b4313d9ac905cf7 WatchSource:0}: Error finding container 9ebfae9f811874072cb34e003d2e85e547644c1b8e0b125c9b4313d9ac905cf7: Status 404 returned error can't find the container with id 9ebfae9f811874072cb34e003d2e85e547644c1b8e0b125c9b4313d9ac905cf7 Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.838594 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868f4bc56b-f2np4"] Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.844582 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f7968cb5c-gnj87" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.851853 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t9gmg"] Oct 06 09:00:42 crc kubenswrapper[4610]: W1006 09:00:42.857307 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0843392c_2df1_4619_9745_21ca7d06a589.slice/crio-efe1e272adbdaedc8928c8f27b68d548e294d5e5c31dd7b5512dfcccc5a33152 WatchSource:0}: Error finding container efe1e272adbdaedc8928c8f27b68d548e294d5e5c31dd7b5512dfcccc5a33152: Status 404 returned error can't find the container with id efe1e272adbdaedc8928c8f27b68d548e294d5e5c31dd7b5512dfcccc5a33152 Oct 06 09:00:42 crc kubenswrapper[4610]: W1006 09:00:42.878971 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4408d93d_c733_4032_92fc_df3c6d8d9b0b.slice/crio-1444f97f52edb8163cac85e14ce2ea5550977f15ee63c060dba73c0b5dbd1d86 WatchSource:0}: Error finding container 1444f97f52edb8163cac85e14ce2ea5550977f15ee63c060dba73c0b5dbd1d86: Status 404 returned error can't find the container with id 1444f97f52edb8163cac85e14ce2ea5550977f15ee63c060dba73c0b5dbd1d86 Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.891649 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htrnr\" (UniqueName: \"kubernetes.io/projected/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-kube-api-access-htrnr\") pod \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.891752 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-horizon-secret-key\") pod \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.891817 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-logs\") pod \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.891881 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-config-data\") pod \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.891924 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-scripts\") pod \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\" (UID: \"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e\") " Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.892698 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-scripts" (OuterVolumeSpecName: "scripts") pod "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" (UID: "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.893367 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-logs" (OuterVolumeSpecName: "logs") pod "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" (UID: "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.893930 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-config-data" (OuterVolumeSpecName: "config-data") pod "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" (UID: "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.915624 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-kube-api-access-htrnr" (OuterVolumeSpecName: "kube-api-access-htrnr") pod "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" (UID: "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e"). InnerVolumeSpecName "kube-api-access-htrnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.924869 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" (UID: "2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.994244 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htrnr\" (UniqueName: \"kubernetes.io/projected/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-kube-api-access-htrnr\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.994497 4610 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.994506 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.994517 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:42 crc kubenswrapper[4610]: I1006 09:00:42.994525 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.073637 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jflcj" event={"ID":"2ca74dbf-7969-4a03-a618-83505fc9c7ec","Type":"ContainerStarted","Data":"f7866e176176a77e9dc26cb0e59f11edc9e033c3851ad535b85050a7e3191c2b"} Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.079967 4610 util.go:48] "No ready sandbox for pod can be 
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.127654 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j2w82" podStartSLOduration=17.127627821 podStartE2EDuration="17.127627821s" podCreationTimestamp="2025-10-06 09:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:43.101617368 +0000 UTC m=+1174.816670776" watchObservedRunningTime="2025-10-06 09:00:43.127627821 +0000 UTC m=+1174.842681219"
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.130487 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-glndl"
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.130653 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd9db456f-6rdvm"
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142665 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerStarted","Data":"0bd002428c63ea24bde01cc2ebfcc6f6e04db5222fda755c5ee2e873d4a7eb67"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142704 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868f4bc56b-f2np4" event={"ID":"0843392c-2df1-4619-9745-21ca7d06a589","Type":"ContainerStarted","Data":"efe1e272adbdaedc8928c8f27b68d548e294d5e5c31dd7b5512dfcccc5a33152"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142722 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t9gmg" event={"ID":"4408d93d-c733-4032-92fc-df3c6d8d9b0b","Type":"ContainerStarted","Data":"1444f97f52edb8163cac85e14ce2ea5550977f15ee63c060dba73c0b5dbd1d86"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142736 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f7968cb5c-gnj87" event={"ID":"2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e","Type":"ContainerDied","Data":"52fb2d9e9088dabde142b4325c07e771887fa2a9ffbe9e34cb031c77fdc020c6"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142749 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8454b778cb-f7b67" event={"ID":"e9c8eb3d-3866-4f23-8ebc-0357571f26a6","Type":"ContainerStarted","Data":"28241f3730c57708bb8af3e326c0d9becd87ffa9455da4c95569af0581f8846d"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142761 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j2w82" event={"ID":"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8","Type":"ContainerStarted","Data":"ca99402205ea54a0920134ae7e29d90ddbc983cc185e4df7dc02c32c47bee88d"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142774 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j2w82" event={"ID":"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8","Type":"ContainerStarted","Data":"9ebfae9f811874072cb34e003d2e85e547644c1b8e0b125c9b4313d9ac905cf7"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142785 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"410fe180-3cd7-4905-9c10-51d51c8e7152","Type":"ContainerStarted","Data":"5657bd1655f35bc304bab97a256e9e296eed9f6f86632c1d0e7752761effab6a"}
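[Note: in the pod_startup_latency_tracker records, podStartE2EDuration is creation-to-running, while podStartSLOduration excludes image-pull time. For keystone-bootstrap-j2w82 above both pull timestamps are Go's zero time ("0001-01-01 00:00:00"), meaning no pull was observed (images cached), so the two durations coincide at 17.13 s; for placement-db-sync-zx98b just below, 29.38 s E2E minus the ~24.97 s spent between firstStartedPulling and lastFinishedPulling leaves the 4.41 s SLO figure. A small report sketch over these records (field names copied from the log, the script itself is mine):]

```python
#!/usr/bin/env python3
# Summarize "Observed pod startup duration" records: E2E vs SLO duration,
# where the difference is time spent pulling images (zero when cached).
import re
import sys

PAT = re.compile(
    r'pod="([^"]+)" podStartSLOduration=([\d.]+) podStartE2EDuration="([\d.]+)s"'
)

print(f"{'pod':45} {'e2e(s)':>8} {'slo(s)':>8} {'pull(s)':>8}")
for line in sys.stdin:
    m = PAT.search(line)
    if m:
        pod, slo, e2e = m.group(1), float(m.group(2)), float(m.group(3))
        print(f"{pod:45} {e2e:8.2f} {slo:8.2f} {e2e - slo:8.2f}")
```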
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142809 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29xd8" event={"ID":"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5","Type":"ContainerStarted","Data":"d3775f9894673dd6073c07ca1f0b32cbf7d361a85ad17d066c63ff4288f050c6"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.142822 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zx98b" event={"ID":"72ba2911-ba6a-40d2-b05e-011016c788c4","Type":"ContainerStarted","Data":"7a89fd45c8d4064110c7d6cc28d42916714e4c5693e04f4e839f1fa6653ce909"}
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.337508 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f7968cb5c-gnj87"]
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.349519 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f7968cb5c-gnj87"]
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.383341 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zx98b" podStartSLOduration=4.41005651 podStartE2EDuration="29.383321496s" podCreationTimestamp="2025-10-06 09:00:14 +0000 UTC" firstStartedPulling="2025-10-06 09:00:16.934756211 +0000 UTC m=+1148.649809589" lastFinishedPulling="2025-10-06 09:00:41.908021167 +0000 UTC m=+1173.623074575" observedRunningTime="2025-10-06 09:00:43.240485357 +0000 UTC m=+1174.955538745" watchObservedRunningTime="2025-10-06 09:00:43.383321496 +0000 UTC m=+1175.098374894"
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.436931 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cd9db456f-6rdvm"]
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.451156 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cd9db456f-6rdvm"]
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.457897 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-glndl"]
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.481166 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-glndl"]
Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.712861 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86bb58785f-lmpd2"
Need to start a new one" pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.858645 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dff6a2f-b759-443f-8c8f-50af38096244-horizon-secret-key\") pod \"8dff6a2f-b759-443f-8c8f-50af38096244\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.858840 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpwdp\" (UniqueName: \"kubernetes.io/projected/8dff6a2f-b759-443f-8c8f-50af38096244-kube-api-access-gpwdp\") pod \"8dff6a2f-b759-443f-8c8f-50af38096244\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.858874 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-scripts\") pod \"8dff6a2f-b759-443f-8c8f-50af38096244\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.858906 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dff6a2f-b759-443f-8c8f-50af38096244-logs\") pod \"8dff6a2f-b759-443f-8c8f-50af38096244\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.858954 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-config-data\") pod \"8dff6a2f-b759-443f-8c8f-50af38096244\" (UID: \"8dff6a2f-b759-443f-8c8f-50af38096244\") " Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.859983 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-config-data" (OuterVolumeSpecName: "config-data") pod "8dff6a2f-b759-443f-8c8f-50af38096244" (UID: "8dff6a2f-b759-443f-8c8f-50af38096244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.861011 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-scripts" (OuterVolumeSpecName: "scripts") pod "8dff6a2f-b759-443f-8c8f-50af38096244" (UID: "8dff6a2f-b759-443f-8c8f-50af38096244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.861405 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dff6a2f-b759-443f-8c8f-50af38096244-logs" (OuterVolumeSpecName: "logs") pod "8dff6a2f-b759-443f-8c8f-50af38096244" (UID: "8dff6a2f-b759-443f-8c8f-50af38096244"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.865134 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dff6a2f-b759-443f-8c8f-50af38096244-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8dff6a2f-b759-443f-8c8f-50af38096244" (UID: "8dff6a2f-b759-443f-8c8f-50af38096244"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.876150 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dff6a2f-b759-443f-8c8f-50af38096244-kube-api-access-gpwdp" (OuterVolumeSpecName: "kube-api-access-gpwdp") pod "8dff6a2f-b759-443f-8c8f-50af38096244" (UID: "8dff6a2f-b759-443f-8c8f-50af38096244"). InnerVolumeSpecName "kube-api-access-gpwdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.960452 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpwdp\" (UniqueName: \"kubernetes.io/projected/8dff6a2f-b759-443f-8c8f-50af38096244-kube-api-access-gpwdp\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.960788 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.960797 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dff6a2f-b759-443f-8c8f-50af38096244-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.960806 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dff6a2f-b759-443f-8c8f-50af38096244-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:43 crc kubenswrapper[4610]: I1006 09:00:43.960816 4610 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dff6a2f-b759-443f-8c8f-50af38096244-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.171056 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868f4bc56b-f2np4" event={"ID":"0843392c-2df1-4619-9745-21ca7d06a589","Type":"ContainerStarted","Data":"16de65e62c295199bac78a6306fde1d837bdf07ee10da3d8fb509084a56cf6f6"} Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.174800 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86bb58785f-lmpd2" event={"ID":"8dff6a2f-b759-443f-8c8f-50af38096244","Type":"ContainerDied","Data":"e2330802b23936adb07bd059da3efa1e1382259fefde1e0e7ba966becd70eac5"} Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.174843 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86bb58785f-lmpd2" Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.179276 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"410fe180-3cd7-4905-9c10-51d51c8e7152","Type":"ContainerStarted","Data":"2ebd7ce0315bfb153503159b9036aabb2743d8df7c95bf2ab2d228d3b05456fd"} Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.181666 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96d0277d-2925-435b-9b4f-c3d83a605e5d","Type":"ContainerStarted","Data":"36f3daf156ca64d3782a90b5a98f9b3a5f6d29ba24ef4520b0ce121caaafcada"} Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.183255 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t9gmg" event={"ID":"4408d93d-c733-4032-92fc-df3c6d8d9b0b","Type":"ContainerStarted","Data":"e286f948c1b89a9b3231a2a8a2d32f066aeffb1cfdc87f67ab136f30ff81987b"} Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.189916 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8454b778cb-f7b67" event={"ID":"e9c8eb3d-3866-4f23-8ebc-0357571f26a6","Type":"ContainerStarted","Data":"980b61421f9d5498d548880b365ac0717f97ce6a22c27c5f878661d138a2b3b7"} Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.207505 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-t9gmg" podStartSLOduration=19.207489613 podStartE2EDuration="19.207489613s" podCreationTimestamp="2025-10-06 09:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:44.198907188 +0000 UTC m=+1175.913960576" watchObservedRunningTime="2025-10-06 09:00:44.207489613 +0000 UTC m=+1175.922543001" Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.251496 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86bb58785f-lmpd2"] Oct 06 09:00:44 crc kubenswrapper[4610]: I1006 09:00:44.251543 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86bb58785f-lmpd2"] Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.086697 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e" path="/var/lib/kubelet/pods/2a0dbbd8-8bb8-4858-ab3e-d5831c9f202e/volumes" Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.088586 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430ee76b-f17b-4059-bb2e-5f87cf6016d6" path="/var/lib/kubelet/pods/430ee76b-f17b-4059-bb2e-5f87cf6016d6/volumes" Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.089131 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dff6a2f-b759-443f-8c8f-50af38096244" path="/var/lib/kubelet/pods/8dff6a2f-b759-443f-8c8f-50af38096244/volumes" Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.089575 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" path="/var/lib/kubelet/pods/9a9f3925-16de-4002-919d-413e1d94a7c0/volumes" Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.223565 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868f4bc56b-f2np4" event={"ID":"0843392c-2df1-4619-9745-21ca7d06a589","Type":"ContainerStarted","Data":"f99801533c6dfaf601cb8ab5b6664388e2402654cb730a6161275354cdc11d2e"} Oct 06 09:00:45 crc kubenswrapper[4610]: 
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.226517 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"410fe180-3cd7-4905-9c10-51d51c8e7152","Type":"ContainerStarted","Data":"f652a53e9131360249f052d54abcf3594dc895ae1a92ff339e8d12f93f5b7ae5"}
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.226728 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-log" containerID="cri-o://2ebd7ce0315bfb153503159b9036aabb2743d8df7c95bf2ab2d228d3b05456fd" gracePeriod=30
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.227096 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-httpd" containerID="cri-o://f652a53e9131360249f052d54abcf3594dc895ae1a92ff339e8d12f93f5b7ae5" gracePeriod=30
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.239338 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96d0277d-2925-435b-9b4f-c3d83a605e5d","Type":"ContainerStarted","Data":"cbf3ee2abd1a5d01589add09cb0608d3f5601bdb2b0c1c9b78ae63c41d1f1552"}
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.239486 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-log" containerID="cri-o://36f3daf156ca64d3782a90b5a98f9b3a5f6d29ba24ef4520b0ce121caaafcada" gracePeriod=30
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.239758 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-httpd" containerID="cri-o://cbf3ee2abd1a5d01589add09cb0608d3f5601bdb2b0c1c9b78ae63c41d1f1552" gracePeriod=30
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.263555 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-868f4bc56b-f2np4" podStartSLOduration=18.530136401 podStartE2EDuration="19.263538138s" podCreationTimestamp="2025-10-06 09:00:26 +0000 UTC" firstStartedPulling="2025-10-06 09:00:42.869360992 +0000 UTC m=+1174.584414380" lastFinishedPulling="2025-10-06 09:00:43.602762729 +0000 UTC m=+1175.317816117" observedRunningTime="2025-10-06 09:00:45.259007384 +0000 UTC m=+1176.974060792" watchObservedRunningTime="2025-10-06 09:00:45.263538138 +0000 UTC m=+1176.978591526"
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.264518 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8454b778cb-f7b67" event={"ID":"e9c8eb3d-3866-4f23-8ebc-0357571f26a6","Type":"ContainerStarted","Data":"44f2e86c3222c2dd45912193c4924309433a91696bc5355fe2a799b451e98b1f"}
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.299005 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.298982879 podStartE2EDuration="19.298982879s" podCreationTimestamp="2025-10-06 09:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:45.293232564 +0000 UTC m=+1177.008285952" watchObservedRunningTime="2025-10-06 09:00:45.298982879 +0000 UTC m=+1177.014036277"
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.318656 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=24.318632833 podStartE2EDuration="24.318632833s" podCreationTimestamp="2025-10-06 09:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:00:45.309988695 +0000 UTC m=+1177.025042093" watchObservedRunningTime="2025-10-06 09:00:45.318632833 +0000 UTC m=+1177.033686251"
Oct 06 09:00:45 crc kubenswrapper[4610]: I1006 09:00:45.345653 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8454b778cb-f7b67" podStartSLOduration=18.46164911 podStartE2EDuration="19.345627511s" podCreationTimestamp="2025-10-06 09:00:26 +0000 UTC" firstStartedPulling="2025-10-06 09:00:42.686947278 +0000 UTC m=+1174.402000666" lastFinishedPulling="2025-10-06 09:00:43.570925679 +0000 UTC m=+1175.285979067" observedRunningTime="2025-10-06 09:00:45.340168764 +0000 UTC m=+1177.055222162" watchObservedRunningTime="2025-10-06 09:00:45.345627511 +0000 UTC m=+1177.060680899"
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.273784 4610 generic.go:334] "Generic (PLEG): container finished" podID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerID="f652a53e9131360249f052d54abcf3594dc895ae1a92ff339e8d12f93f5b7ae5" exitCode=143
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.273821 4610 generic.go:334] "Generic (PLEG): container finished" podID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerID="2ebd7ce0315bfb153503159b9036aabb2743d8df7c95bf2ab2d228d3b05456fd" exitCode=143
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.273862 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"410fe180-3cd7-4905-9c10-51d51c8e7152","Type":"ContainerDied","Data":"f652a53e9131360249f052d54abcf3594dc895ae1a92ff339e8d12f93f5b7ae5"}
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.273892 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"410fe180-3cd7-4905-9c10-51d51c8e7152","Type":"ContainerDied","Data":"2ebd7ce0315bfb153503159b9036aabb2743d8df7c95bf2ab2d228d3b05456fd"}
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.278570 4610 generic.go:334] "Generic (PLEG): container finished" podID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerID="cbf3ee2abd1a5d01589add09cb0608d3f5601bdb2b0c1c9b78ae63c41d1f1552" exitCode=143
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.278603 4610 generic.go:334] "Generic (PLEG): container finished" podID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerID="36f3daf156ca64d3782a90b5a98f9b3a5f6d29ba24ef4520b0ce121caaafcada" exitCode=143
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.279560 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96d0277d-2925-435b-9b4f-c3d83a605e5d","Type":"ContainerDied","Data":"cbf3ee2abd1a5d01589add09cb0608d3f5601bdb2b0c1c9b78ae63c41d1f1552"}
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.279587 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96d0277d-2925-435b-9b4f-c3d83a605e5d","Type":"ContainerDied","Data":"36f3daf156ca64d3782a90b5a98f9b3a5f6d29ba24ef4520b0ce121caaafcada"}
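[Note: exitCode=143 on the four glance containers above is the shell convention 128 + signal number, so 143 means the process exited on SIGTERM, i.e. each container shut down cleanly within the 30 s grace period requested by the "Killing container with a grace period" records rather than being SIGKILLed (which would report 137). A tiny decoder sketch:]

```python
# Decode container exit codes as they appear in the PLEG records above.
# Codes above 128 follow the convention 128 + signal number.
import signal

def describe_exit(code: int) -> str:
    if code > 128:
        try:
            name = signal.Signals(code - 128).name
        except ValueError:
            name = f"signal {code - 128}"
        return f"terminated by {name}"
    return "exited normally" if code == 0 else f"exited with status {code}"

# 143 -> terminated by SIGTERM (orderly shutdown inside the grace period);
# 137 -> SIGKILL, i.e. the grace period expired; 0 -> clean exit.
for code in (143, 137, 0):
    print(code, "->", describe_exit(code))
```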
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.469201 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.469295 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.469366 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr"
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.470497 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a03a6c0215984950d574d138749aa7d53fb617a66262307cd832997f9be78d9"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.470598 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://2a03a6c0215984950d574d138749aa7d53fb617a66262307cd832997f9be78d9" gracePeriod=600
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.914186 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8454b778cb-f7b67"
Oct 06 09:00:46 crc kubenswrapper[4610]: I1006 09:00:46.914667 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8454b778cb-f7b67"
Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.011679 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-868f4bc56b-f2np4"
Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.011724 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-868f4bc56b-f2np4"
Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.296567 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="2a03a6c0215984950d574d138749aa7d53fb617a66262307cd832997f9be78d9" exitCode=0
Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.297595 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"2a03a6c0215984950d574d138749aa7d53fb617a66262307cd832997f9be78d9"}
Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.297676 4610 scope.go:117] "RemoveContainer" containerID="7bdba77b46e82044baaa28f03a702e74591a001a85966cb8cf3dd9e4ff7e62b2"
Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.811855 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
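[Note: the sequence above is the complete liveness-restart cycle for machine-config-daemon-6w5xr: the probe gets connection-refused on 127.0.0.1:8798, the SyncLoop marks the container unhealthy, the runtime kills it with the pod's 600 s grace period (it exits immediately, exitCode=0), the previous dead instance 7bdba77b... is garbage-collected via RemoveContainer, and a fresh container appears shortly after (86993ec2... at 09:00:48 below). A sketch to pair failures with the restart that follows; the message shapes are copied from these lines, the script is mine:]

```python
#!/usr/bin/env python3
# Pair liveness-probe failures with the ContainerStarted that follows, per pod.
import re
import sys

fail = re.compile(r'"Probe failed" probeType="Liveness" pod="([^"]+)"')
start = re.compile(r'"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" '
                   r'event={"ID":"[^"]+","Type":"ContainerStarted"')

pending = set()
for line in sys.stdin:
    ts = line[:15]  # "Oct 06 09:00:46" journal timestamp prefix
    if m := fail.search(line):
        pending.add(m.group(1))
        print(f"{ts}  liveness failed  {m.group(1)}")
    elif (m := start.search(line)) and m.group(1) in pending:
        pending.discard(m.group(1))
        print(f"{ts}  restarted        {m.group(1)}")
```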
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939541 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-scripts\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939628 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-logs\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939659 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-config-data\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939694 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnsdt\" (UniqueName: \"kubernetes.io/projected/96d0277d-2925-435b-9b4f-c3d83a605e5d-kube-api-access-pnsdt\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939808 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-httpd-run\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939872 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-combined-ca-bundle\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.939977 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"96d0277d-2925-435b-9b4f-c3d83a605e5d\" (UID: \"96d0277d-2925-435b-9b4f-c3d83a605e5d\") " Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.945376 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d0277d-2925-435b-9b4f-c3d83a605e5d-kube-api-access-pnsdt" (OuterVolumeSpecName: "kube-api-access-pnsdt") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "kube-api-access-pnsdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.945898 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.946035 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-logs" (OuterVolumeSpecName: "logs") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.946377 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.949818 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-scripts" (OuterVolumeSpecName: "scripts") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:47 crc kubenswrapper[4610]: I1006 09:00:47.999905 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.024227 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-config-data" (OuterVolumeSpecName: "config-data") pod "96d0277d-2925-435b-9b4f-c3d83a605e5d" (UID: "96d0277d-2925-435b-9b4f-c3d83a605e5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042480 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042519 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042554 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042564 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042574 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96d0277d-2925-435b-9b4f-c3d83a605e5d-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042582 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d0277d-2925-435b-9b4f-c3d83a605e5d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.042590 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnsdt\" (UniqueName: \"kubernetes.io/projected/96d0277d-2925-435b-9b4f-c3d83a605e5d-kube-api-access-pnsdt\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.061078 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.144474 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.322503 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"86993ec28f7e0d41d125c67a4926a12ed67d073648b7d992a6c8ef6e8c000659"} Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.326694 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96d0277d-2925-435b-9b4f-c3d83a605e5d","Type":"ContainerDied","Data":"23ac11d4202f4d98e48046deb3aa458a811933860c13a884f2bdbe0a69bbd191"} Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.326747 4610 scope.go:117] "RemoveContainer" containerID="cbf3ee2abd1a5d01589add09cb0608d3f5601bdb2b0c1c9b78ae63c41d1f1552" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.326880 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.373162 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.387341 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.422029 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:48 crc kubenswrapper[4610]: E1006 09:00:48.422956 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-log" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.422968 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-log" Oct 06 09:00:48 crc kubenswrapper[4610]: E1006 09:00:48.422987 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="init" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.422994 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="init" Oct 06 09:00:48 crc kubenswrapper[4610]: E1006 09:00:48.423012 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.423017 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" Oct 06 09:00:48 crc kubenswrapper[4610]: E1006 09:00:48.423211 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-httpd" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.423219 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-httpd" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.423507 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-log" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.423524 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" containerName="glance-httpd" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.423564 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9f3925-16de-4002-919d-413e1d94a7c0" containerName="dnsmasq-dns" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.424594 4610 util.go:30] "No sandbox for pod can be found. 
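[Note: the RemoveStaleState burst above fires when the replacement glance-default-internal-api-0 is admitted (SyncLoop ADD): the CPU and memory managers drop per-container accounting left behind by the deleted pod UIDs 96d0277d... and 9a9f3925.... The cpu_manager lines carry klog's E (error) prefix but, paired as they are with the routine state_mem confirmations, they are cleanup noise here, which makes severity alone a poor filter. A sketch that counts records per klog level and source location, so genuinely unusual E-lines stand out; reads journalctl output on stdin:]

```python
#!/usr/bin/env python3
# Count kubenswrapper records per klog level (I/W/E/F) and source file:line.
import re
import sys
from collections import Counter

klog = re.compile(
    r'kubenswrapper\[\d+\]: ([IWEF])\d{4} \d\d:\d\d:\d\d\.\d+ +\d+ ([\w.]+:\d+)\]'
)

by_level = Counter()
for line in sys.stdin:
    if m := klog.search(line):
        by_level[(m.group(1), m.group(2))] += 1

for (level, src), n in sorted(by_level.items()):
    print(f"{level}  {n:5d}  {src}")
```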
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.428052 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.428356 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.462650 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.553124 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.553510 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.553643 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.553841 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.553955 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.554064 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndbv\" (UniqueName: \"kubernetes.io/projected/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-kube-api-access-8ndbv\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.554248 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.554487 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656383 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656444 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndbv\" (UniqueName: \"kubernetes.io/projected/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-kube-api-access-8ndbv\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656531 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656553 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656592 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656655 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656697 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656756 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.656863 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.657385 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.657752 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.663513 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.666516 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.668984 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.670455 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.675066 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndbv\" (UniqueName: \"kubernetes.io/projected/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-kube-api-access-8ndbv\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.685014 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:00:48 crc kubenswrapper[4610]: I1006 09:00:48.747340 4610 util.go:30] "No sandbox for pod can be found. 
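[Note: the mount side for the recreated pod mirrors the teardown seen earlier, in order: VerifyControllerAttachedVolume (reconciler_common.go:245) for each desired volume, MountVolume started (reconciler_common.go:218), then for the local PV a MountVolume.MountDevice global mount at /mnt/openstack/pv08 before the per-pod MountVolume.SetUp, while secrets, configmaps and empty-dirs go straight to SetUp. A sketch to trace one volume through those phases, with the volume name (e.g. local-storage08-crc) passed as argv[1]; the phase markers are copied from the lines above:]

```python
#!/usr/bin/env python3
# Trace a single volume through the kubelet mount phases:
# verify-attached -> mount-started -> device-mounted -> setup-done.
import re
import sys

vol = re.escape(sys.argv[1])
phases = [
    ("verify-attached", rf'VerifyControllerAttachedVolume started for volume .?"{vol}'),
    ("mount-started", rf'MountVolume started for volume .?"{vol}'),
    ("device-mounted", rf'MountVolume\.MountDevice succeeded for volume .?"{vol}'),
    ("setup-done", rf'MountVolume\.SetUp succeeded for volume .?"{vol}'),
]
pats = [(name, re.compile(p)) for name, p in phases]

for line in sys.stdin:
    for name, pat in pats:
        if pat.search(line):
            # line[:15] is the "Oct 06 09:00:48" journal timestamp prefix.
            print(f"{line[:15]}  {name}")
```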
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:00:49 crc kubenswrapper[4610]: I1006 09:00:49.089153 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d0277d-2925-435b-9b4f-c3d83a605e5d" path="/var/lib/kubelet/pods/96d0277d-2925-435b-9b4f-c3d83a605e5d/volumes" Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.375168 4610 generic.go:334] "Generic (PLEG): container finished" podID="3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" containerID="ca99402205ea54a0920134ae7e29d90ddbc983cc185e4df7dc02c32c47bee88d" exitCode=0 Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.375236 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j2w82" event={"ID":"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8","Type":"ContainerDied","Data":"ca99402205ea54a0920134ae7e29d90ddbc983cc185e4df7dc02c32c47bee88d"} Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.814655 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.913907 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-combined-ca-bundle\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914093 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-logs\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914167 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914199 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kfrc\" (UniqueName: \"kubernetes.io/projected/410fe180-3cd7-4905-9c10-51d51c8e7152-kube-api-access-5kfrc\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914246 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-config-data\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914333 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-scripts\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914425 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-public-tls-certs\") pod \"410fe180-3cd7-4905-9c10-51d51c8e7152\" (UID: \"410fe180-3cd7-4905-9c10-51d51c8e7152\") " Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.914460 
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.915812 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-logs" (OuterVolumeSpecName: "logs") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.916754 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.917483 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.917509 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410fe180-3cd7-4905-9c10-51d51c8e7152-logs\") on node \"crc\" DevicePath \"\""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.942259 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-scripts" (OuterVolumeSpecName: "scripts") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.942400 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.954363 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410fe180-3cd7-4905-9c10-51d51c8e7152-kube-api-access-5kfrc" (OuterVolumeSpecName: "kube-api-access-5kfrc") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "kube-api-access-5kfrc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:00:51 crc kubenswrapper[4610]: I1006 09:00:51.976324 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.019045 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kfrc\" (UniqueName: \"kubernetes.io/projected/410fe180-3cd7-4905-9c10-51d51c8e7152-kube-api-access-5kfrc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.019266 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.019352 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.019460 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.048358 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.088527 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.107072 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-config-data" (OuterVolumeSpecName: "config-data") pod "410fe180-3cd7-4905-9c10-51d51c8e7152" (UID: "410fe180-3cd7-4905-9c10-51d51c8e7152"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.121992 4610 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.122240 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.122410 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410fe180-3cd7-4905-9c10-51d51c8e7152-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.386877 4610 generic.go:334] "Generic (PLEG): container finished" podID="72ba2911-ba6a-40d2-b05e-011016c788c4" containerID="7a89fd45c8d4064110c7d6cc28d42916714e4c5693e04f4e839f1fa6653ce909" exitCode=0 Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.386972 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zx98b" event={"ID":"72ba2911-ba6a-40d2-b05e-011016c788c4","Type":"ContainerDied","Data":"7a89fd45c8d4064110c7d6cc28d42916714e4c5693e04f4e839f1fa6653ce909"} Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.391194 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerStarted","Data":"b918cc7af1e0f0820468950e40adf5227e585211f91b5c0034a1d5555a254819"} Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.396120 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.406769 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"410fe180-3cd7-4905-9c10-51d51c8e7152","Type":"ContainerDied","Data":"5657bd1655f35bc304bab97a256e9e296eed9f6f86632c1d0e7752761effab6a"} Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.443070 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.455216 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.473893 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:52 crc kubenswrapper[4610]: E1006 09:00:52.474594 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-log" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.474614 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-log" Oct 06 09:00:52 crc kubenswrapper[4610]: E1006 09:00:52.474641 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-httpd" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.474669 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-httpd" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.474950 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-log" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.474979 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" containerName="glance-httpd" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.476635 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.497576 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.498230 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.521089 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628697 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-config-data\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628770 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-logs\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628816 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sn5v\" (UniqueName: \"kubernetes.io/projected/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-kube-api-access-9sn5v\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628841 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628897 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-scripts\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628931 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628949 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.628968 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731471 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-config-data\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731516 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-logs\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731569 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sn5v\" (UniqueName: \"kubernetes.io/projected/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-kube-api-access-9sn5v\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731600 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731668 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-scripts\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731703 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731727 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.731754 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.732255 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.733507 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-logs\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.735763 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.740506 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-config-data\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.744803 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-scripts\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.745403 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.747766 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.750543 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sn5v\" (UniqueName: \"kubernetes.io/projected/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-kube-api-access-9sn5v\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.788795 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " pod="openstack/glance-default-external-api-0" Oct 06 09:00:52 crc kubenswrapper[4610]: I1006 09:00:52.821249 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:00:53 crc kubenswrapper[4610]: I1006 09:00:53.090176 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410fe180-3cd7-4905-9c10-51d51c8e7152" path="/var/lib/kubelet/pods/410fe180-3cd7-4905-9c10-51d51c8e7152/volumes" Oct 06 09:00:56 crc kubenswrapper[4610]: I1006 09:00:56.919785 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Oct 06 09:00:57 crc kubenswrapper[4610]: I1006 09:00:57.013701 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-868f4bc56b-f2np4" podUID="0843392c-2df1-4619-9745-21ca7d06a589" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Oct 06 09:01:05 crc kubenswrapper[4610]: E1006 09:01:05.054100 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 06 09:01:05 crc kubenswrapper[4610]: E1006 09:01:05.054929 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jsqfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-29xd8_openstack(32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 09:01:05 crc kubenswrapper[4610]: E1006 09:01:05.057014 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-29xd8" podUID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.161198 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.161405 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zx98b" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288129 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-scripts\") pod \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288215 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-scripts\") pod \"72ba2911-ba6a-40d2-b05e-011016c788c4\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288292 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ba2911-ba6a-40d2-b05e-011016c788c4-logs\") pod \"72ba2911-ba6a-40d2-b05e-011016c788c4\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288353 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-config-data\") pod \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288391 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-fernet-keys\") pod \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288453 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5scr9\" (UniqueName: \"kubernetes.io/projected/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-kube-api-access-5scr9\") pod \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288503 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-config-data\") pod \"72ba2911-ba6a-40d2-b05e-011016c788c4\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288529 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-combined-ca-bundle\") pod \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288564 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-combined-ca-bundle\") pod 
\"72ba2911-ba6a-40d2-b05e-011016c788c4\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288592 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-credential-keys\") pod \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\" (UID: \"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.288669 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcdrh\" (UniqueName: \"kubernetes.io/projected/72ba2911-ba6a-40d2-b05e-011016c788c4-kube-api-access-rcdrh\") pod \"72ba2911-ba6a-40d2-b05e-011016c788c4\" (UID: \"72ba2911-ba6a-40d2-b05e-011016c788c4\") " Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.294492 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ba2911-ba6a-40d2-b05e-011016c788c4-logs" (OuterVolumeSpecName: "logs") pod "72ba2911-ba6a-40d2-b05e-011016c788c4" (UID: "72ba2911-ba6a-40d2-b05e-011016c788c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.296524 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" (UID: "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.299785 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-kube-api-access-5scr9" (OuterVolumeSpecName: "kube-api-access-5scr9") pod "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" (UID: "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8"). InnerVolumeSpecName "kube-api-access-5scr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.325635 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-scripts" (OuterVolumeSpecName: "scripts") pod "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" (UID: "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.326192 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ba2911-ba6a-40d2-b05e-011016c788c4-kube-api-access-rcdrh" (OuterVolumeSpecName: "kube-api-access-rcdrh") pod "72ba2911-ba6a-40d2-b05e-011016c788c4" (UID: "72ba2911-ba6a-40d2-b05e-011016c788c4"). InnerVolumeSpecName "kube-api-access-rcdrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.327939 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-scripts" (OuterVolumeSpecName: "scripts") pod "72ba2911-ba6a-40d2-b05e-011016c788c4" (UID: "72ba2911-ba6a-40d2-b05e-011016c788c4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.333244 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" (UID: "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.336769 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-config-data" (OuterVolumeSpecName: "config-data") pod "72ba2911-ba6a-40d2-b05e-011016c788c4" (UID: "72ba2911-ba6a-40d2-b05e-011016c788c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.357761 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72ba2911-ba6a-40d2-b05e-011016c788c4" (UID: "72ba2911-ba6a-40d2-b05e-011016c788c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.361989 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-config-data" (OuterVolumeSpecName: "config-data") pod "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" (UID: "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.372627 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" (UID: "3d65a6aa-72d2-4b32-b19f-b76c50c13bc8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391388 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391425 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391438 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ba2911-ba6a-40d2-b05e-011016c788c4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391473 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391487 4610 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391500 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5scr9\" (UniqueName: \"kubernetes.io/projected/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-kube-api-access-5scr9\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391513 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391546 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391559 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ba2911-ba6a-40d2-b05e-011016c788c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391568 4610 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.391747 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcdrh\" (UniqueName: \"kubernetes.io/projected/72ba2911-ba6a-40d2-b05e-011016c788c4-kube-api-access-rcdrh\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.512162 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j2w82" event={"ID":"3d65a6aa-72d2-4b32-b19f-b76c50c13bc8","Type":"ContainerDied","Data":"9ebfae9f811874072cb34e003d2e85e547644c1b8e0b125c9b4313d9ac905cf7"} Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.512207 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ebfae9f811874072cb34e003d2e85e547644c1b8e0b125c9b4313d9ac905cf7" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.512180 4610 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j2w82" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.516311 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zx98b" Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.516307 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zx98b" event={"ID":"72ba2911-ba6a-40d2-b05e-011016c788c4","Type":"ContainerDied","Data":"41849fc2fa734b46341e322265a4ee05e7d9354e30a45f7ea0cc264f54603bea"} Oct 06 09:01:05 crc kubenswrapper[4610]: I1006 09:01:05.516467 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41849fc2fa734b46341e322265a4ee05e7d9354e30a45f7ea0cc264f54603bea" Oct 06 09:01:05 crc kubenswrapper[4610]: E1006 09:01:05.523287 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-29xd8" podUID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.303765 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-84759bdbdc-r6gkv"] Oct 06 09:01:06 crc kubenswrapper[4610]: E1006 09:01:06.304257 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" containerName="keystone-bootstrap" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.304276 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" containerName="keystone-bootstrap" Oct 06 09:01:06 crc kubenswrapper[4610]: E1006 09:01:06.304301 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ba2911-ba6a-40d2-b05e-011016c788c4" containerName="placement-db-sync" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.304308 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ba2911-ba6a-40d2-b05e-011016c788c4" containerName="placement-db-sync" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.304581 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" containerName="keystone-bootstrap" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.304604 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ba2911-ba6a-40d2-b05e-011016c788c4" containerName="placement-db-sync" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.305179 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.312320 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.312536 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-md8m4" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.312713 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.312811 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.312902 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.313146 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.335993 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84759bdbdc-r6gkv"] Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.419985 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b7659d8b-729ds"] Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.424884 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.429268 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.429493 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.429617 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qlrf5" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.430146 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.430172 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.433616 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b7659d8b-729ds"] Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.506907 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-combined-ca-bundle\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.506964 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-config-data\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.506992 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-credential-keys\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.507011 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s62m4\" (UniqueName: \"kubernetes.io/projected/27ee29ca-3774-42c0-a3d0-164644f89e7d-kube-api-access-s62m4\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.508698 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-scripts\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.508857 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-internal-tls-certs\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.508895 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-fernet-keys\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.508918 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-public-tls-certs\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610165 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-internal-tls-certs\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610204 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-fernet-keys\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610222 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-public-tls-certs\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610243 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7nkds\" (UniqueName: \"kubernetes.io/projected/a417a1f5-8eba-4d85-9b6e-730463fe2734-kube-api-access-7nkds\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610289 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-public-tls-certs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610319 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-combined-ca-bundle\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610340 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-config-data\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610362 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-credential-keys\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610378 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s62m4\" (UniqueName: \"kubernetes.io/projected/27ee29ca-3774-42c0-a3d0-164644f89e7d-kube-api-access-s62m4\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610393 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-internal-tls-certs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610422 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-scripts\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610443 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-combined-ca-bundle\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610475 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-scripts\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610511 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-config-data\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.610533 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a417a1f5-8eba-4d85-9b6e-730463fe2734-logs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.614551 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-credential-keys\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.614968 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-internal-tls-certs\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.618323 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-public-tls-certs\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.618661 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-combined-ca-bundle\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.622764 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-config-data\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.627572 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-scripts\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.631667 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ee29ca-3774-42c0-a3d0-164644f89e7d-fernet-keys\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " 
pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.639611 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s62m4\" (UniqueName: \"kubernetes.io/projected/27ee29ca-3774-42c0-a3d0-164644f89e7d-kube-api-access-s62m4\") pod \"keystone-84759bdbdc-r6gkv\" (UID: \"27ee29ca-3774-42c0-a3d0-164644f89e7d\") " pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.711787 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-public-tls-certs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.711880 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-internal-tls-certs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.711926 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-scripts\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.711944 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-combined-ca-bundle\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.711994 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-config-data\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.712011 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a417a1f5-8eba-4d85-9b6e-730463fe2734-logs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.712087 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkds\" (UniqueName: \"kubernetes.io/projected/a417a1f5-8eba-4d85-9b6e-730463fe2734-kube-api-access-7nkds\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.715817 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a417a1f5-8eba-4d85-9b6e-730463fe2734-logs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.719705 4610 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-combined-ca-bundle\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.723662 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-config-data\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.725365 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-scripts\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.725755 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-internal-tls-certs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.728004 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a417a1f5-8eba-4d85-9b6e-730463fe2734-public-tls-certs\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.729275 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkds\" (UniqueName: \"kubernetes.io/projected/a417a1f5-8eba-4d85-9b6e-730463fe2734-kube-api-access-7nkds\") pod \"placement-b7659d8b-729ds\" (UID: \"a417a1f5-8eba-4d85-9b6e-730463fe2734\") " pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.751801 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.915058 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Oct 06 09:01:06 crc kubenswrapper[4610]: I1006 09:01:06.937418 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:07 crc kubenswrapper[4610]: I1006 09:01:07.011926 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-868f4bc56b-f2np4" podUID="0843392c-2df1-4619-9745-21ca7d06a589" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Oct 06 09:01:07 crc kubenswrapper[4610]: E1006 09:01:07.048737 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 06 09:01:07 crc kubenswrapper[4610]: I1006 09:01:07.048774 4610 scope.go:117] "RemoveContainer" containerID="36f3daf156ca64d3782a90b5a98f9b3a5f6d29ba24ef4520b0ce121caaafcada" Oct 06 09:01:07 crc kubenswrapper[4610]: E1006 09:01:07.048890 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wm9cx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jflcj_openstack(2ca74dbf-7969-4a03-a618-83505fc9c7ec): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Oct 06 09:01:07 crc kubenswrapper[4610]: E1006 09:01:07.050171 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jflcj" podUID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" Oct 06 09:01:07 crc kubenswrapper[4610]: E1006 09:01:07.535034 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jflcj" podUID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" Oct 06 09:01:08 crc kubenswrapper[4610]: I1006 09:01:08.985671 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:01:10 crc kubenswrapper[4610]: I1006 09:01:10.151063 4610 scope.go:117] "RemoveContainer" containerID="f652a53e9131360249f052d54abcf3594dc895ae1a92ff339e8d12f93f5b7ae5" Oct 06 09:01:10 crc kubenswrapper[4610]: W1006 09:01:10.164315 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbf0caa2_fa42_437c_8a1a_c691b35f5d3a.slice/crio-0734404a66405d1de3b67fe32832da53c8050c4a463aff4112ee0756462fd513 WatchSource:0}: Error finding container 0734404a66405d1de3b67fe32832da53c8050c4a463aff4112ee0756462fd513: Status 404 returned error can't find the container with id 0734404a66405d1de3b67fe32832da53c8050c4a463aff4112ee0756462fd513 Oct 06 09:01:10 crc kubenswrapper[4610]: I1006 09:01:10.319132 4610 scope.go:117] "RemoveContainer" containerID="2ebd7ce0315bfb153503159b9036aabb2743d8df7c95bf2ab2d228d3b05456fd" Oct 06 09:01:10 crc kubenswrapper[4610]: I1006 09:01:10.615520 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a","Type":"ContainerStarted","Data":"0734404a66405d1de3b67fe32832da53c8050c4a463aff4112ee0756462fd513"} Oct 06 09:01:10 crc kubenswrapper[4610]: I1006 09:01:10.793113 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:01:10 crc kubenswrapper[4610]: I1006 09:01:10.847322 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b7659d8b-729ds"] Oct 06 09:01:10 crc kubenswrapper[4610]: I1006 09:01:10.945037 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84759bdbdc-r6gkv"] Oct 06 09:01:10 crc kubenswrapper[4610]: W1006 09:01:10.952303 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27ee29ca_3774_42c0_a3d0_164644f89e7d.slice/crio-ed23c1d7ec96319bfb63b7e4cd416142d74420d46f4f1a20a9b7710b9e11510c WatchSource:0}: Error finding container ed23c1d7ec96319bfb63b7e4cd416142d74420d46f4f1a20a9b7710b9e11510c: Status 404 returned error can't find the container with id ed23c1d7ec96319bfb63b7e4cd416142d74420d46f4f1a20a9b7710b9e11510c Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.629794 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerStarted","Data":"d2aa2ca84700d2c31c6de55ab3ad79e02df9735dbe3dbfeb9c1224ab9fb0e63d"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 
09:01:11.632519 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7659d8b-729ds" event={"ID":"a417a1f5-8eba-4d85-9b6e-730463fe2734","Type":"ContainerStarted","Data":"b3c20fb83417b4c5e3c0d1ff6e412549b0a5dbefe9fc5bfe35ee65ea8c7bd887"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.632570 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7659d8b-729ds" event={"ID":"a417a1f5-8eba-4d85-9b6e-730463fe2734","Type":"ContainerStarted","Data":"81546ceb6c96ba36a3ad3b07bed9ee7c7ec145fbd33d817d033922c0b461a158"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.632582 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b7659d8b-729ds" event={"ID":"a417a1f5-8eba-4d85-9b6e-730463fe2734","Type":"ContainerStarted","Data":"78ba4787f2b2a9f35c60669b10afcb2791d3f19c5c285221b084829cc1a644e0"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.632595 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.632638 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.635614 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a","Type":"ContainerStarted","Data":"58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.638292 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a","Type":"ContainerStarted","Data":"0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.638315 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a","Type":"ContainerStarted","Data":"148517bf2ae786a03b261dccf2f97a4bbf3907b3ee5d082ee9c11ef8806c6811"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.640013 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84759bdbdc-r6gkv" event={"ID":"27ee29ca-3774-42c0-a3d0-164644f89e7d","Type":"ContainerStarted","Data":"897120fbd9de9b64ce0fee060a7424f4768fea8cffc848427ce93a07670fdd79"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.640034 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84759bdbdc-r6gkv" event={"ID":"27ee29ca-3774-42c0-a3d0-164644f89e7d","Type":"ContainerStarted","Data":"ed23c1d7ec96319bfb63b7e4cd416142d74420d46f4f1a20a9b7710b9e11510c"} Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.640489 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:11 crc kubenswrapper[4610]: I1006 09:01:11.663823 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b7659d8b-729ds" podStartSLOduration=5.663806749 podStartE2EDuration="5.663806749s" podCreationTimestamp="2025-10-06 09:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:11.650587527 +0000 UTC m=+1203.365640915" watchObservedRunningTime="2025-10-06 09:01:11.663806749 +0000 UTC m=+1203.378860127" Oct 06 
09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.648890 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a","Type":"ContainerStarted","Data":"babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e"} Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.653574 4610 generic.go:334] "Generic (PLEG): container finished" podID="4408d93d-c733-4032-92fc-df3c6d8d9b0b" containerID="e286f948c1b89a9b3231a2a8a2d32f066aeffb1cfdc87f67ab136f30ff81987b" exitCode=0 Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.653662 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t9gmg" event={"ID":"4408d93d-c733-4032-92fc-df3c6d8d9b0b","Type":"ContainerDied","Data":"e286f948c1b89a9b3231a2a8a2d32f066aeffb1cfdc87f67ab136f30ff81987b"} Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.656934 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a","Type":"ContainerStarted","Data":"a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a"} Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.668319 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-84759bdbdc-r6gkv" podStartSLOduration=6.668303677 podStartE2EDuration="6.668303677s" podCreationTimestamp="2025-10-06 09:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:11.683803151 +0000 UTC m=+1203.398856549" watchObservedRunningTime="2025-10-06 09:01:12.668303677 +0000 UTC m=+1204.383357065" Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.674668 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.674645077 podStartE2EDuration="20.674645077s" podCreationTimestamp="2025-10-06 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:12.66600856 +0000 UTC m=+1204.381061948" watchObservedRunningTime="2025-10-06 09:01:12.674645077 +0000 UTC m=+1204.389698465" Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.705306 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=24.705290487 podStartE2EDuration="24.705290487s" podCreationTimestamp="2025-10-06 09:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:12.699209984 +0000 UTC m=+1204.414263362" watchObservedRunningTime="2025-10-06 09:01:12.705290487 +0000 UTC m=+1204.420343885" Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.822923 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.822969 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.855496 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 09:01:12 crc kubenswrapper[4610]: I1006 09:01:12.869378 4610 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 09:01:13 crc kubenswrapper[4610]: I1006 09:01:13.664987 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 09:01:13 crc kubenswrapper[4610]: I1006 09:01:13.665326 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.083375 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.244589 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-config\") pod \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.245679 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb88d\" (UniqueName: \"kubernetes.io/projected/4408d93d-c733-4032-92fc-df3c6d8d9b0b-kube-api-access-sb88d\") pod \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.246368 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-combined-ca-bundle\") pod \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\" (UID: \"4408d93d-c733-4032-92fc-df3c6d8d9b0b\") " Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.249427 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4408d93d-c733-4032-92fc-df3c6d8d9b0b-kube-api-access-sb88d" (OuterVolumeSpecName: "kube-api-access-sb88d") pod "4408d93d-c733-4032-92fc-df3c6d8d9b0b" (UID: "4408d93d-c733-4032-92fc-df3c6d8d9b0b"). InnerVolumeSpecName "kube-api-access-sb88d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.269848 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-config" (OuterVolumeSpecName: "config") pod "4408d93d-c733-4032-92fc-df3c6d8d9b0b" (UID: "4408d93d-c733-4032-92fc-df3c6d8d9b0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.277744 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4408d93d-c733-4032-92fc-df3c6d8d9b0b" (UID: "4408d93d-c733-4032-92fc-df3c6d8d9b0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.348575 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb88d\" (UniqueName: \"kubernetes.io/projected/4408d93d-c733-4032-92fc-df3c6d8d9b0b-kube-api-access-sb88d\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.348614 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.348627 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4408d93d-c733-4032-92fc-df3c6d8d9b0b-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.713401 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t9gmg" event={"ID":"4408d93d-c733-4032-92fc-df3c6d8d9b0b","Type":"ContainerDied","Data":"1444f97f52edb8163cac85e14ce2ea5550977f15ee63c060dba73c0b5dbd1d86"} Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.713449 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1444f97f52edb8163cac85e14ce2ea5550977f15ee63c060dba73c0b5dbd1d86" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.713414 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t9gmg" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.716894 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerStarted","Data":"9d886fa85a8b146adb458b1b8c0b64db0c57aeec249e1e63b8833e07e2e2edf3"} Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.717148 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-central-agent" containerID="cri-o://0bd002428c63ea24bde01cc2ebfcc6f6e04db5222fda755c5ee2e873d4a7eb67" gracePeriod=30 Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.717473 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.717827 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="proxy-httpd" containerID="cri-o://9d886fa85a8b146adb458b1b8c0b64db0c57aeec249e1e63b8833e07e2e2edf3" gracePeriod=30 Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.717930 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="sg-core" containerID="cri-o://d2aa2ca84700d2c31c6de55ab3ad79e02df9735dbe3dbfeb9c1224ab9fb0e63d" gracePeriod=30 Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.717992 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-notification-agent" containerID="cri-o://b918cc7af1e0f0820468950e40adf5227e585211f91b5c0034a1d5555a254819" gracePeriod=30 Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.747586 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.747809 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.747902 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.748007 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.753906 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.062503901 podStartE2EDuration="1m4.753876855s" podCreationTimestamp="2025-10-06 09:00:14 +0000 UTC" firstStartedPulling="2025-10-06 09:00:16.45471524 +0000 UTC m=+1148.169768628" lastFinishedPulling="2025-10-06 09:01:18.146088184 +0000 UTC m=+1209.861141582" observedRunningTime="2025-10-06 09:01:18.74571372 +0000 UTC m=+1210.460767128" watchObservedRunningTime="2025-10-06 09:01:18.753876855 +0000 UTC m=+1210.468930263" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.782636 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:18 crc kubenswrapper[4610]: I1006 09:01:18.800401 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.339706 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k4qm6"] Oct 06 09:01:19 crc kubenswrapper[4610]: E1006 09:01:19.340241 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4408d93d-c733-4032-92fc-df3c6d8d9b0b" containerName="neutron-db-sync" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.340254 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4408d93d-c733-4032-92fc-df3c6d8d9b0b" containerName="neutron-db-sync" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.340450 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4408d93d-c733-4032-92fc-df3c6d8d9b0b" containerName="neutron-db-sync" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.341294 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.360069 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k4qm6"] Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.461015 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-764bc97c84-flcb2"] Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.462489 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.468336 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.468685 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.468894 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.468958 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.469037 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.469101 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.469208 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-config\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.469231 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7kj\" (UniqueName: \"kubernetes.io/projected/098bf04f-4224-4657-b2ab-828f7194a6ea-kube-api-access-wp7kj\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.470185 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmxdv" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.475031 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.477888 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-764bc97c84-flcb2"] Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570692 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-combined-ca-bundle\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570802 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr8jj\" (UniqueName: \"kubernetes.io/projected/96270269-2a63-42c6-a881-36aba28d88ae-kube-api-access-cr8jj\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570838 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-config\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570888 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7kj\" (UniqueName: \"kubernetes.io/projected/098bf04f-4224-4657-b2ab-828f7194a6ea-kube-api-access-wp7kj\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570916 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-ovndb-tls-certs\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570939 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-config\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.570981 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.571008 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.571101 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.571265 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.571396 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-httpd-config\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.576159 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.577915 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.578117 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.578338 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.578385 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-config\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.610789 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7kj\" (UniqueName: \"kubernetes.io/projected/098bf04f-4224-4657-b2ab-828f7194a6ea-kube-api-access-wp7kj\") pod \"dnsmasq-dns-5ccc5c4795-k4qm6\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.668392 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.677001 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-combined-ca-bundle\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.677064 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr8jj\" (UniqueName: \"kubernetes.io/projected/96270269-2a63-42c6-a881-36aba28d88ae-kube-api-access-cr8jj\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.677091 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-ovndb-tls-certs\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.677107 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-config\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.677169 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-httpd-config\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.681479 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-httpd-config\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.684441 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-ovndb-tls-certs\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.685779 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-combined-ca-bundle\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.703963 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-config\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.713768 4610 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cr8jj\" (UniqueName: \"kubernetes.io/projected/96270269-2a63-42c6-a881-36aba28d88ae-kube-api-access-cr8jj\") pod \"neutron-764bc97c84-flcb2\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.755397 4610 generic.go:334] "Generic (PLEG): container finished" podID="cef8c381-61e9-4b18-abe5-d657d9885979" containerID="9d886fa85a8b146adb458b1b8c0b64db0c57aeec249e1e63b8833e07e2e2edf3" exitCode=0 Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.755753 4610 generic.go:334] "Generic (PLEG): container finished" podID="cef8c381-61e9-4b18-abe5-d657d9885979" containerID="d2aa2ca84700d2c31c6de55ab3ad79e02df9735dbe3dbfeb9c1224ab9fb0e63d" exitCode=2 Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.755764 4610 generic.go:334] "Generic (PLEG): container finished" podID="cef8c381-61e9-4b18-abe5-d657d9885979" containerID="0bd002428c63ea24bde01cc2ebfcc6f6e04db5222fda755c5ee2e873d4a7eb67" exitCode=0 Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.756936 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerDied","Data":"9d886fa85a8b146adb458b1b8c0b64db0c57aeec249e1e63b8833e07e2e2edf3"} Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.756974 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerDied","Data":"d2aa2ca84700d2c31c6de55ab3ad79e02df9735dbe3dbfeb9c1224ab9fb0e63d"} Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.756987 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerDied","Data":"0bd002428c63ea24bde01cc2ebfcc6f6e04db5222fda755c5ee2e873d4a7eb67"} Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.776831 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.887365 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:01:19 crc kubenswrapper[4610]: I1006 09:01:19.944506 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:01:20 crc kubenswrapper[4610]: I1006 09:01:20.385954 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k4qm6"] Oct 06 09:01:20 crc kubenswrapper[4610]: I1006 09:01:20.572627 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-764bc97c84-flcb2"] Oct 06 09:01:20 crc kubenswrapper[4610]: I1006 09:01:20.766004 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-764bc97c84-flcb2" event={"ID":"96270269-2a63-42c6-a881-36aba28d88ae","Type":"ContainerStarted","Data":"9be5b77c267c4a4369669e84ad16461209c2e84eec0f431e36a1108c65d75393"} Oct 06 09:01:20 crc kubenswrapper[4610]: I1006 09:01:20.768792 4610 generic.go:334] "Generic (PLEG): container finished" podID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerID="f25b7b2d92c57c71efcab161f302b4c7bd013835e48afc2212482cd0a612cc34" exitCode=0 Oct 06 09:01:20 crc kubenswrapper[4610]: I1006 09:01:20.768849 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" event={"ID":"098bf04f-4224-4657-b2ab-828f7194a6ea","Type":"ContainerDied","Data":"f25b7b2d92c57c71efcab161f302b4c7bd013835e48afc2212482cd0a612cc34"} Oct 06 09:01:20 crc kubenswrapper[4610]: I1006 09:01:20.768919 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" event={"ID":"098bf04f-4224-4657-b2ab-828f7194a6ea","Type":"ContainerStarted","Data":"5d8485d38957439143e9228b0ed0343091a8979405db176ecc722398698a162b"} Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.780155 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-764bc97c84-flcb2" event={"ID":"96270269-2a63-42c6-a881-36aba28d88ae","Type":"ContainerStarted","Data":"e75f73484809e0670748ddfa9ecabb283cf437fb8d9f21bd30477ffbf9ea81c0"} Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.780446 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-764bc97c84-flcb2" event={"ID":"96270269-2a63-42c6-a881-36aba28d88ae","Type":"ContainerStarted","Data":"caec19a4f32b7d63efc63516b9720214f2bd61b4e25caa1b10e7f9e209953778"} Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.780693 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.782700 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" event={"ID":"098bf04f-4224-4657-b2ab-828f7194a6ea","Type":"ContainerStarted","Data":"5b760dc66259c04d6b71169f424024306a9a9e518ba7af7c73add9d190a1eccc"} Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.782923 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.817555 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-764bc97c84-flcb2" podStartSLOduration=2.817517383 podStartE2EDuration="2.817517383s" podCreationTimestamp="2025-10-06 09:01:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:21.80823673 +0000 UTC m=+1213.523290158" watchObservedRunningTime="2025-10-06 09:01:21.817517383 +0000 UTC m=+1213.532570781" Oct 06 09:01:21 crc kubenswrapper[4610]: I1006 09:01:21.849200 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" podStartSLOduration=2.849180349 podStartE2EDuration="2.849180349s" podCreationTimestamp="2025-10-06 09:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:21.839150017 +0000 UTC m=+1213.554203415" watchObservedRunningTime="2025-10-06 09:01:21.849180349 +0000 UTC m=+1213.564233737" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.305394 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69d948d6bf-n5vv6"] Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.306770 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.311333 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.311699 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.324117 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d948d6bf-n5vv6"] Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351752 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-ovndb-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351800 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-public-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351832 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-httpd-config\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351856 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-combined-ca-bundle\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351882 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkt5g\" (UniqueName: 
\"kubernetes.io/projected/f61e2bff-9119-4208-a7a0-c8da777e049b-kube-api-access-hkt5g\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351907 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-config\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.351950 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-internal-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453050 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-public-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453100 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-ovndb-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453128 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-httpd-config\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453152 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-combined-ca-bundle\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453180 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkt5g\" (UniqueName: \"kubernetes.io/projected/f61e2bff-9119-4208-a7a0-c8da777e049b-kube-api-access-hkt5g\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453206 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-config\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.453248 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-internal-tls-certs\") pod 
\"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.465080 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-internal-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.465182 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-ovndb-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.465889 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-public-tls-certs\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.467533 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-combined-ca-bundle\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.467827 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-httpd-config\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.468390 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f61e2bff-9119-4208-a7a0-c8da777e049b-config\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.487769 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkt5g\" (UniqueName: \"kubernetes.io/projected/f61e2bff-9119-4208-a7a0-c8da777e049b-kube-api-access-hkt5g\") pod \"neutron-69d948d6bf-n5vv6\" (UID: \"f61e2bff-9119-4208-a7a0-c8da777e049b\") " pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:22 crc kubenswrapper[4610]: I1006 09:01:22.627136 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.144614 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.387501 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-868f4bc56b-f2np4" Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.521394 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8454b778cb-f7b67"] Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.833809 4610 generic.go:334] "Generic (PLEG): container finished" podID="cef8c381-61e9-4b18-abe5-d657d9885979" containerID="b918cc7af1e0f0820468950e40adf5227e585211f91b5c0034a1d5555a254819" exitCode=0 Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.834021 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon-log" containerID="cri-o://980b61421f9d5498d548880b365ac0717f97ce6a22c27c5f878661d138a2b3b7" gracePeriod=30 Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.834134 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerDied","Data":"b918cc7af1e0f0820468950e40adf5227e585211f91b5c0034a1d5555a254819"} Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.834500 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" containerID="cri-o://44f2e86c3222c2dd45912193c4924309433a91696bc5355fe2a799b451e98b1f" gracePeriod=30 Oct 06 09:01:24 crc kubenswrapper[4610]: I1006 09:01:24.893636 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d948d6bf-n5vv6"] Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.225792 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.421751 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzgcf\" (UniqueName: \"kubernetes.io/projected/cef8c381-61e9-4b18-abe5-d657d9885979-kube-api-access-gzgcf\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.421815 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-config-data\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.421842 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-sg-core-conf-yaml\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.421955 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-run-httpd\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.421988 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-log-httpd\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.422162 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-scripts\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.422193 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-combined-ca-bundle\") pod \"cef8c381-61e9-4b18-abe5-d657d9885979\" (UID: \"cef8c381-61e9-4b18-abe5-d657d9885979\") " Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.426457 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.426718 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.467822 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef8c381-61e9-4b18-abe5-d657d9885979-kube-api-access-gzgcf" (OuterVolumeSpecName: "kube-api-access-gzgcf") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "kube-api-access-gzgcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.474170 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-scripts" (OuterVolumeSpecName: "scripts") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.474350 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.526745 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.526786 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzgcf\" (UniqueName: \"kubernetes.io/projected/cef8c381-61e9-4b18-abe5-d657d9885979-kube-api-access-gzgcf\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.526802 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.526812 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.526824 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cef8c381-61e9-4b18-abe5-d657d9885979-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.591293 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.624659 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-config-data" (OuterVolumeSpecName: "config-data") pod "cef8c381-61e9-4b18-abe5-d657d9885979" (UID: "cef8c381-61e9-4b18-abe5-d657d9885979"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.629960 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.629995 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef8c381-61e9-4b18-abe5-d657d9885979-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.870586 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d948d6bf-n5vv6" event={"ID":"f61e2bff-9119-4208-a7a0-c8da777e049b","Type":"ContainerStarted","Data":"0ff002741b25ef9bde3b6d290a476e1c92b080462c641224c0717cc451c7e7f8"} Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.870888 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d948d6bf-n5vv6" event={"ID":"f61e2bff-9119-4208-a7a0-c8da777e049b","Type":"ContainerStarted","Data":"a3bd094fa72488c86de0a7e5442632040326e74c80b91626cee3308f3a1582b5"} Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.870901 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d948d6bf-n5vv6" event={"ID":"f61e2bff-9119-4208-a7a0-c8da777e049b","Type":"ContainerStarted","Data":"fa44e1dbfb6f5f0e16d774ebc0fa7af95b150b1162cf9cf6939809f8f4af5773"} Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.871234 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.896364 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69d948d6bf-n5vv6" podStartSLOduration=3.896344479 podStartE2EDuration="3.896344479s" podCreationTimestamp="2025-10-06 09:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:25.892835781 +0000 UTC m=+1217.607889189" watchObservedRunningTime="2025-10-06 09:01:25.896344479 +0000 UTC m=+1217.611397867" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.899446 4610 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.899743 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cef8c381-61e9-4b18-abe5-d657d9885979","Type":"ContainerDied","Data":"4c0a443b48154b7bd5dc05c09dd3544c0519842d515a8704dc59bc88031cd6c1"} Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.899794 4610 scope.go:117] "RemoveContainer" containerID="9d886fa85a8b146adb458b1b8c0b64db0c57aeec249e1e63b8833e07e2e2edf3" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.924751 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29xd8" event={"ID":"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5","Type":"ContainerStarted","Data":"f187f0499e60ca9263f4ef445bc5b30abc0ed6aa9779c414a635f7445ffc04f6"} Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.943722 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.954433 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.966950 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-29xd8" podStartSLOduration=18.876663441 podStartE2EDuration="1m0.966930512s" podCreationTimestamp="2025-10-06 09:00:25 +0000 UTC" firstStartedPulling="2025-10-06 09:00:42.395929736 +0000 UTC m=+1174.110983124" lastFinishedPulling="2025-10-06 09:01:24.486196807 +0000 UTC m=+1216.201250195" observedRunningTime="2025-10-06 09:01:25.959032934 +0000 UTC m=+1217.674086322" watchObservedRunningTime="2025-10-06 09:01:25.966930512 +0000 UTC m=+1217.681983900" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.976075 4610 scope.go:117] "RemoveContainer" containerID="d2aa2ca84700d2c31c6de55ab3ad79e02df9735dbe3dbfeb9c1224ab9fb0e63d" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.981895 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:01:25 crc kubenswrapper[4610]: E1006 09:01:25.982330 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="proxy-httpd" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982346 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="proxy-httpd" Oct 06 09:01:25 crc kubenswrapper[4610]: E1006 09:01:25.982363 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-notification-agent" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982369 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-notification-agent" Oct 06 09:01:25 crc kubenswrapper[4610]: E1006 09:01:25.982397 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-central-agent" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982402 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-central-agent" Oct 06 09:01:25 crc kubenswrapper[4610]: E1006 09:01:25.982414 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="sg-core" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006
09:01:25.982420 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="sg-core" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982600 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-central-agent" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982614 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="sg-core" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982626 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="ceilometer-notification-agent" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.982636 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" containerName="proxy-httpd" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.984222 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.988229 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:01:25 crc kubenswrapper[4610]: I1006 09:01:25.990154 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.004852 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.058219 4610 scope.go:117] "RemoveContainer" containerID="b918cc7af1e0f0820468950e40adf5227e585211f91b5c0034a1d5555a254819" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.121980 4610 scope.go:117] "RemoveContainer" containerID="0bd002428c63ea24bde01cc2ebfcc6f6e04db5222fda755c5ee2e873d4a7eb67" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.145068 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r57zd\" (UniqueName: \"kubernetes.io/projected/a68c0582-1bb2-4c8e-a81a-730d123ae768-kube-api-access-r57zd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.146260 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-log-httpd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.146341 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.146378 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.146409 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-scripts\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0"
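
Note: the DELETE/REMOVE/ADD sequence above is the ceilometer-0 pod being replaced under the same name: the old UID cef8c381-61e9-4b18-abe5-d657d9885979 is torn down and a new pod with UID a68c0582-1bb2-4c8e-a81a-730d123ae768 is admitted, with the cpu_manager/memory_manager "RemoveStaleState" entries clearing per-container resource state keyed by the old UID first. A small sketch for spotting such same-name, new-UID replacements in a saved copy of this log (Python; the kubelet.log path is a placeholder and the regex is keyed to the PLEG lines above):

    import re
    from collections import defaultdict

    # PLEG event lines carry both the pod name and its UID (the "ID" field).
    pat = re.compile(r'event for pod" pod="(?P<pod>[^"]+)" event=\{"ID":"(?P<uid>[0-9a-f-]+)"')

    uids = defaultdict(set)
    with open("kubelet.log") as f:      # placeholder path
        for line in f:
            if (m := pat.search(line)):
                uids[m["pod"]].add(m["uid"])

    for pod, seen in sorted(uids.items()):
        if len(seen) > 1:               # same pod name observed with more than one UID
            print(pod, sorted(seen))    # openstack/ceilometer-0 lists cef8c381-... and a68c0582-...
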
Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.146430 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-config-data\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.146446 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-run-httpd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247705 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-log-httpd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247783 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247818 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247849 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-scripts\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247865 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-config-data\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247882 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-run-httpd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.247958 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r57zd\" (UniqueName: \"kubernetes.io/projected/a68c0582-1bb2-4c8e-a81a-730d123ae768-kube-api-access-r57zd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006
09:01:26.248124 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-log-httpd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.248867 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-run-httpd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.255953 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.265370 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-scripts\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.265760 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57zd\" (UniqueName: \"kubernetes.io/projected/a68c0582-1bb2-4c8e-a81a-730d123ae768-kube-api-access-r57zd\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.276701 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.277156 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-config-data\") pod \"ceilometer-0\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " pod="openstack/ceilometer-0" Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.333188 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
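
Note: the volume entries above walk each of the seven ceilometer-0 volumes through the kubelet reconciler's three phases: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). A quick audit that every started mount reached SetUp, under the same assumptions as the previous sketch (the doubled backslashes match the \" escaping klog uses around volume names in these lines):

    import re

    start_pat = re.compile(r'MountVolume started for volume \\"([^\\"]+)\\"')
    ok_pat = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^\\"]+)\\"')

    started, mounted = set(), set()
    with open("kubelet.log") as f:      # placeholder path
        for line in f:
            if (m := start_pat.search(line)):
                started.add(m.group(1))
            if (m := ok_pat.search(line)):
                mounted.add(m.group(1))

    print("mounts still pending:", started - mounted)   # empty here: all seven completed
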
Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.806308 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.934957 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jflcj" event={"ID":"2ca74dbf-7969-4a03-a618-83505fc9c7ec","Type":"ContainerStarted","Data":"8bfc650ceeae3f526c02589de3817799b86fc030d345e9699069dc45c8242cc3"} Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.936124 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerStarted","Data":"1ece93078a18809eb15b470284d8e04f45d37460e5c9c632b5db05c053ed3a1e"} Oct 06 09:01:26 crc kubenswrapper[4610]: I1006 09:01:26.952765 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jflcj" podStartSLOduration=19.819936422 podStartE2EDuration="1m1.952747072s" podCreationTimestamp="2025-10-06 09:00:25 +0000 UTC" firstStartedPulling="2025-10-06 09:00:42.709950966 +0000 UTC m=+1174.425004354" lastFinishedPulling="2025-10-06 09:01:24.842761616 +0000 UTC m=+1216.557815004" observedRunningTime="2025-10-06 09:01:26.950800503 +0000 UTC m=+1218.665853901" watchObservedRunningTime="2025-10-06 09:01:26.952747072 +0000 UTC m=+1218.667800470" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.122161 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef8c381-61e9-4b18-abe5-d657d9885979" path="/var/lib/kubelet/pods/cef8c381-61e9-4b18-abe5-d657d9885979/volumes" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.428203 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.429367 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.429454 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.460828 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.654616 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 09:01:27 crc kubenswrapper[4610]: I1006 09:01:27.951208 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerStarted","Data":"ad8ec97a6d2bf074ed199d3e6854a04afeb4a19c2fa8cc10c8a52d445dd3e700"} Oct 06 09:01:28 crc kubenswrapper[4610]: I1006 09:01:28.302365 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:38754->10.217.0.151:8443: read: connection reset by peer" Oct 06 09:01:28 crc kubenswrapper[4610]: I1006 09:01:28.966321 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerStarted","Data":"b90c2921ed9406e9fa55ee0930e4263fe37219332d1c6a739991283ec66ee627"} Oct 06 09:01:28 crc kubenswrapper[4610]: I1006 09:01:28.966535 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerStarted","Data":"f717e5f26cf60271be30adf7d9f51a9a861ccf4f3f830ad4bd0246a9ed421fe6"} Oct 06 09:01:28 crc kubenswrapper[4610]: I1006 09:01:28.981248 4610 generic.go:334] "Generic (PLEG): container finished" podID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerID="44f2e86c3222c2dd45912193c4924309433a91696bc5355fe2a799b451e98b1f" exitCode=0 Oct 06 09:01:28 crc kubenswrapper[4610]: I1006 09:01:28.981290 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8454b778cb-f7b67" event={"ID":"e9c8eb3d-3866-4f23-8ebc-0357571f26a6","Type":"ContainerDied","Data":"44f2e86c3222c2dd45912193c4924309433a91696bc5355fe2a799b451e98b1f"} Oct 06 09:01:29 crc kubenswrapper[4610]: I1006 09:01:29.669200 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:29 crc kubenswrapper[4610]: I1006 09:01:29.762709 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-lrn2q"] Oct 06 09:01:29 crc kubenswrapper[4610]: I1006 09:01:29.762966 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" podUID="c5571882-ca5c-46a2-9598-377e7c779036" containerName="dnsmasq-dns" containerID="cri-o://c3d7947c6c4d4523abf33e6ffa5955cea85b7a462b1750f1eed9df7592f2c7e1" gracePeriod=10 Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.002200 4610 generic.go:334] "Generic (PLEG): container finished" podID="c5571882-ca5c-46a2-9598-377e7c779036" containerID="c3d7947c6c4d4523abf33e6ffa5955cea85b7a462b1750f1eed9df7592f2c7e1" exitCode=0 Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.002549 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" event={"ID":"c5571882-ca5c-46a2-9598-377e7c779036","Type":"ContainerDied","Data":"c3d7947c6c4d4523abf33e6ffa5955cea85b7a462b1750f1eed9df7592f2c7e1"} Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.471150 4610 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.551163 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-swift-storage-0\") pod \"c5571882-ca5c-46a2-9598-377e7c779036\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.551234 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-config\") pod \"c5571882-ca5c-46a2-9598-377e7c779036\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.551283 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-nb\") pod \"c5571882-ca5c-46a2-9598-377e7c779036\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.551304 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nw6\" (UniqueName: \"kubernetes.io/projected/c5571882-ca5c-46a2-9598-377e7c779036-kube-api-access-48nw6\") pod \"c5571882-ca5c-46a2-9598-377e7c779036\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.551368 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-sb\") pod \"c5571882-ca5c-46a2-9598-377e7c779036\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.551462 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-svc\") pod \"c5571882-ca5c-46a2-9598-377e7c779036\" (UID: \"c5571882-ca5c-46a2-9598-377e7c779036\") " Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.556796 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5571882-ca5c-46a2-9598-377e7c779036-kube-api-access-48nw6" (OuterVolumeSpecName: "kube-api-access-48nw6") pod "c5571882-ca5c-46a2-9598-377e7c779036" (UID: "c5571882-ca5c-46a2-9598-377e7c779036"). InnerVolumeSpecName "kube-api-access-48nw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.609816 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-config" (OuterVolumeSpecName: "config") pod "c5571882-ca5c-46a2-9598-377e7c779036" (UID: "c5571882-ca5c-46a2-9598-377e7c779036"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.617067 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5571882-ca5c-46a2-9598-377e7c779036" (UID: "c5571882-ca5c-46a2-9598-377e7c779036"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.636536 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5571882-ca5c-46a2-9598-377e7c779036" (UID: "c5571882-ca5c-46a2-9598-377e7c779036"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.654798 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.655007 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.655019 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48nw6\" (UniqueName: \"kubernetes.io/projected/c5571882-ca5c-46a2-9598-377e7c779036-kube-api-access-48nw6\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.655028 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.665560 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5571882-ca5c-46a2-9598-377e7c779036" (UID: "c5571882-ca5c-46a2-9598-377e7c779036"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.669400 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5571882-ca5c-46a2-9598-377e7c779036" (UID: "c5571882-ca5c-46a2-9598-377e7c779036"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.757258 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:30 crc kubenswrapper[4610]: I1006 09:01:30.757295 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5571882-ca5c-46a2-9598-377e7c779036-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.013517 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerStarted","Data":"d0cff3adc5fbd03695f5bd42ce8e4739fcdd77785b619e0ef57573894281ee26"} Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.014698 4610 generic.go:334] "Generic (PLEG): container finished" podID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" containerID="f187f0499e60ca9263f4ef445bc5b30abc0ed6aa9779c414a635f7445ffc04f6" exitCode=0 Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.014760 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29xd8" event={"ID":"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5","Type":"ContainerDied","Data":"f187f0499e60ca9263f4ef445bc5b30abc0ed6aa9779c414a635f7445ffc04f6"} Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.016331 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" event={"ID":"c5571882-ca5c-46a2-9598-377e7c779036","Type":"ContainerDied","Data":"26395376bb0ec542afde54f1d520810d881dea2483e19863b19a776dea64162e"} Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.016359 4610 scope.go:117] "RemoveContainer" containerID="c3d7947c6c4d4523abf33e6ffa5955cea85b7a462b1750f1eed9df7592f2c7e1" Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.016401 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-lrn2q" Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.049398 4610 scope.go:117] "RemoveContainer" containerID="197833cb4fe91c18b70378d742c893f770918d4a189d37611bc9c02c2daa2ba2" Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.053428 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.517851821 podStartE2EDuration="6.053382715s" podCreationTimestamp="2025-10-06 09:01:25 +0000 UTC" firstStartedPulling="2025-10-06 09:01:26.808916378 +0000 UTC m=+1218.523969766" lastFinishedPulling="2025-10-06 09:01:30.344447272 +0000 UTC m=+1222.059500660" observedRunningTime="2025-10-06 09:01:31.04678315 +0000 UTC m=+1222.761836548" watchObservedRunningTime="2025-10-06 09:01:31.053382715 +0000 UTC m=+1222.768436123" Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.084245 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-lrn2q"] Oct 06 09:01:31 crc kubenswrapper[4610]: I1006 09:01:31.084287 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-lrn2q"] Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.031057 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.354355 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-29xd8" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.485368 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-db-sync-config-data\") pod \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.485545 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-combined-ca-bundle\") pod \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.485643 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsqfm\" (UniqueName: \"kubernetes.io/projected/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-kube-api-access-jsqfm\") pod \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\" (UID: \"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5\") " Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.500987 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-kube-api-access-jsqfm" (OuterVolumeSpecName: "kube-api-access-jsqfm") pod "32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" (UID: "32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5"). InnerVolumeSpecName "kube-api-access-jsqfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.505874 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" (UID: "32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.520400 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" (UID: "32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.587884 4610 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.587915 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:32 crc kubenswrapper[4610]: I1006 09:01:32.587924 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsqfm\" (UniqueName: \"kubernetes.io/projected/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5-kube-api-access-jsqfm\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.039294 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-29xd8"
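
Note: barbican-db-sync-29xd8 is a one-shot database-sync pod, and the entries above are the standard post-exit teardown: its container finished with exitCode=0 earlier, then UnmountVolume.TearDown runs for each volume and "Volume detached" confirms the unmount on node crc. The exit codes are the part worth scanning for; a last sketch in the same style (placeholder path again):

    import re

    pat = re.compile(r'container finished" podID="([^"]+)" '
                     r'containerID="([0-9a-f]+)" exitCode=(-?\d+)')

    with open("kubelet.log") as f:      # placeholder path
        for line in f:
            m = pat.search(line)
            if m and m.group(3) != "0":
                pod_uid, cid, code = m.groups()
                print(f"non-zero exit {code}: pod {pod_uid} container {cid[:12]}")
    # every "container finished" entry in this section reports exitCode=0, so this prints nothing
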
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.045573 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29xd8" event={"ID":"32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5","Type":"ContainerDied","Data":"d3775f9894673dd6073c07ca1f0b32cbf7d361a85ad17d066c63ff4288f050c6"} Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.045630 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3775f9894673dd6073c07ca1f0b32cbf7d361a85ad17d066c63ff4288f050c6" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.083949 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5571882-ca5c-46a2-9598-377e7c779036" path="/var/lib/kubelet/pods/c5571882-ca5c-46a2-9598-377e7c779036/volumes" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.347748 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-79f9c7c798-mtvls"] Oct 06 09:01:33 crc kubenswrapper[4610]: E1006 09:01:33.348193 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" containerName="barbican-db-sync" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.348215 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" containerName="barbican-db-sync" Oct 06 09:01:33 crc kubenswrapper[4610]: E1006 09:01:33.348240 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5571882-ca5c-46a2-9598-377e7c779036" containerName="init" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.348249 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5571882-ca5c-46a2-9598-377e7c779036" containerName="init" Oct 06 09:01:33 crc kubenswrapper[4610]: E1006 09:01:33.348279 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5571882-ca5c-46a2-9598-377e7c779036" containerName="dnsmasq-dns" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.348287 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5571882-ca5c-46a2-9598-377e7c779036" containerName="dnsmasq-dns" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.348537 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5571882-ca5c-46a2-9598-377e7c779036" containerName="dnsmasq-dns" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.348560 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" containerName="barbican-db-sync" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.360109 4610 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.363748 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79f9c7c798-mtvls"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.366541 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.366934 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f75p4" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.367800 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.487360 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-776cb49575-gr6gq"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.494217 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.504199 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.550723 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfzhp\" (UniqueName: \"kubernetes.io/projected/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-kube-api-access-jfzhp\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.550803 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-config-data\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.550831 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-combined-ca-bundle\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.550892 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-config-data-custom\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.550928 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-logs\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 
09:01:33.563764 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-776cb49575-gr6gq"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.577583 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-9x6x8"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.580499 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.598822 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-9x6x8"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.615196 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54b57c9f78-lbmlj"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.616791 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.624271 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54b57c9f78-lbmlj"] Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.632615 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654619 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfzhp\" (UniqueName: \"kubernetes.io/projected/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-kube-api-access-jfzhp\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654683 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-config-data-custom\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654716 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-config-data\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654737 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-combined-ca-bundle\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654765 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd5kp\" (UniqueName: \"kubernetes.io/projected/8e13d8e7-f118-4d18-ab69-162dadc7f649-kube-api-access-gd5kp\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654787 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-config-data\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654818 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e13d8e7-f118-4d18-ab69-162dadc7f649-logs\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654844 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-config-data-custom\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654879 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-logs\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.654910 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-combined-ca-bundle\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.656440 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-logs\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.665440 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-combined-ca-bundle\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.667473 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-config-data\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.684986 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-config-data-custom\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " 
pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.703860 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfzhp\" (UniqueName: \"kubernetes.io/projected/9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d-kube-api-access-jfzhp\") pod \"barbican-keystone-listener-79f9c7c798-mtvls\" (UID: \"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d\") " pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756040 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-combined-ca-bundle\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756120 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-svc\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756163 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-combined-ca-bundle\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756187 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data-custom\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756208 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756222 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756244 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff84256d-299b-45c7-8b7a-454d64cd0244-logs\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756262 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-config-data-custom\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756293 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvnm\" (UniqueName: \"kubernetes.io/projected/ff84256d-299b-45c7-8b7a-454d64cd0244-kube-api-access-zsvnm\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756311 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756328 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2g8t\" (UniqueName: \"kubernetes.io/projected/2266d633-3c38-449f-8d38-bcb7c03d9871-kube-api-access-s2g8t\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756347 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-config\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756369 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd5kp\" (UniqueName: \"kubernetes.io/projected/8e13d8e7-f118-4d18-ab69-162dadc7f649-kube-api-access-gd5kp\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756391 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-config-data\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756425 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.756442 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e13d8e7-f118-4d18-ab69-162dadc7f649-logs\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.758387 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e13d8e7-f118-4d18-ab69-162dadc7f649-logs\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.761639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-combined-ca-bundle\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.762214 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-config-data-custom\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.771703 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e13d8e7-f118-4d18-ab69-162dadc7f649-config-data\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.778334 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd5kp\" (UniqueName: \"kubernetes.io/projected/8e13d8e7-f118-4d18-ab69-162dadc7f649-kube-api-access-gd5kp\") pod \"barbican-worker-776cb49575-gr6gq\" (UID: \"8e13d8e7-f118-4d18-ab69-162dadc7f649\") " pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.850533 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-776cb49575-gr6gq" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.858598 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvnm\" (UniqueName: \"kubernetes.io/projected/ff84256d-299b-45c7-8b7a-454d64cd0244-kube-api-access-zsvnm\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.859082 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.860070 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.859111 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2g8t\" (UniqueName: \"kubernetes.io/projected/2266d633-3c38-449f-8d38-bcb7c03d9871-kube-api-access-s2g8t\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.860169 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-config\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.860902 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-config\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861084 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861674 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-combined-ca-bundle\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861711 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-svc\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861790 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data-custom\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861877 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861900 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.861946 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff84256d-299b-45c7-8b7a-454d64cd0244-logs\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.862419 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff84256d-299b-45c7-8b7a-454d64cd0244-logs\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.862614 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-svc\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.863430 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.863807 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.866727 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.873333 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-combined-ca-bundle\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.879453 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data-custom\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.892750 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvnm\" (UniqueName: \"kubernetes.io/projected/ff84256d-299b-45c7-8b7a-454d64cd0244-kube-api-access-zsvnm\") pod \"barbican-api-54b57c9f78-lbmlj\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " pod="openstack/barbican-api-54b57c9f78-lbmlj"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.892795 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2g8t\" (UniqueName: \"kubernetes.io/projected/2266d633-3c38-449f-8d38-bcb7c03d9871-kube-api-access-s2g8t\") pod \"dnsmasq-dns-688c87cc99-9x6x8\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.905431 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8"
Oct 06 09:01:33 crc kubenswrapper[4610]: I1006 09:01:33.933017 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b57c9f78-lbmlj"
Need to start a new one" pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" Oct 06 09:01:34 crc kubenswrapper[4610]: W1006 09:01:34.433246 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e13d8e7_f118_4d18_ab69_162dadc7f649.slice/crio-983accab9ae9c41a9e4db8979388a692656578582cd9df2cd622dd33fc412577 WatchSource:0}: Error finding container 983accab9ae9c41a9e4db8979388a692656578582cd9df2cd622dd33fc412577: Status 404 returned error can't find the container with id 983accab9ae9c41a9e4db8979388a692656578582cd9df2cd622dd33fc412577 Oct 06 09:01:34 crc kubenswrapper[4610]: I1006 09:01:34.440292 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-776cb49575-gr6gq"] Oct 06 09:01:34 crc kubenswrapper[4610]: I1006 09:01:34.613604 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54b57c9f78-lbmlj"] Oct 06 09:01:34 crc kubenswrapper[4610]: I1006 09:01:34.770661 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79f9c7c798-mtvls"] Oct 06 09:01:34 crc kubenswrapper[4610]: I1006 09:01:34.793923 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-9x6x8"] Oct 06 09:01:35 crc kubenswrapper[4610]: I1006 09:01:35.084281 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b57c9f78-lbmlj" event={"ID":"ff84256d-299b-45c7-8b7a-454d64cd0244","Type":"ContainerStarted","Data":"6f2c41817a62375df35be86b9797fb05a7fd706c6e01cb42f5ef271d8320a9e6"} Oct 06 09:01:35 crc kubenswrapper[4610]: I1006 09:01:35.084794 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b57c9f78-lbmlj" event={"ID":"ff84256d-299b-45c7-8b7a-454d64cd0244","Type":"ContainerStarted","Data":"57b26f036b30d9f0530fbbe1725c89405553740a907f9c5dabec8add3a78c9d3"} Oct 06 09:01:35 crc kubenswrapper[4610]: I1006 09:01:35.084867 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" event={"ID":"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d","Type":"ContainerStarted","Data":"d59b71b34e0f3b342b95020b78b92aa7f650ad676d6f8a03de08a028f34c83b0"} Oct 06 09:01:35 crc kubenswrapper[4610]: I1006 09:01:35.084941 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" event={"ID":"2266d633-3c38-449f-8d38-bcb7c03d9871","Type":"ContainerStarted","Data":"c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac"} Oct 06 09:01:35 crc kubenswrapper[4610]: I1006 09:01:35.085002 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" event={"ID":"2266d633-3c38-449f-8d38-bcb7c03d9871","Type":"ContainerStarted","Data":"b828ad1f10fb48658f9fcd0cbb5086b39650c49b5283d5add8880631d83b6b7c"} Oct 06 09:01:35 crc kubenswrapper[4610]: I1006 09:01:35.085159 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-776cb49575-gr6gq" event={"ID":"8e13d8e7-f118-4d18-ab69-162dadc7f649","Type":"ContainerStarted","Data":"983accab9ae9c41a9e4db8979388a692656578582cd9df2cd622dd33fc412577"} Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.103771 4610 generic.go:334] "Generic (PLEG): container finished" podID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerID="c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac" exitCode=0 Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.104026 4610 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" event={"ID":"2266d633-3c38-449f-8d38-bcb7c03d9871","Type":"ContainerDied","Data":"c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac"} Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.104066 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" event={"ID":"2266d633-3c38-449f-8d38-bcb7c03d9871","Type":"ContainerStarted","Data":"2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2"} Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.104803 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.116940 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b57c9f78-lbmlj" event={"ID":"ff84256d-299b-45c7-8b7a-454d64cd0244","Type":"ContainerStarted","Data":"e54199d7ed64789b07edfaa8797f5c347ec8f27d3c16ad9fbbae7840a6239288"} Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.117663 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.117687 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.133138 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" podStartSLOduration=3.133122421 podStartE2EDuration="3.133122421s" podCreationTimestamp="2025-10-06 09:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:36.13032823 +0000 UTC m=+1227.845381618" watchObservedRunningTime="2025-10-06 09:01:36.133122421 +0000 UTC m=+1227.848175809" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.151610 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54b57c9f78-lbmlj" podStartSLOduration=3.151594925 podStartE2EDuration="3.151594925s" podCreationTimestamp="2025-10-06 09:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:36.146334663 +0000 UTC m=+1227.861388051" watchObservedRunningTime="2025-10-06 09:01:36.151594925 +0000 UTC m=+1227.866648313" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.754404 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6849467754-2xpgn"] Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.756778 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.759745 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.759929 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.790062 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6849467754-2xpgn"] Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.922955 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-combined-ca-bundle\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.922999 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-config-data\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.923021 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-config-data-custom\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.923058 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-internal-tls-certs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.923097 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-logs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.923121 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22cg\" (UniqueName: \"kubernetes.io/projected/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-kube-api-access-q22cg\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.923142 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-public-tls-certs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:36 crc kubenswrapper[4610]: I1006 09:01:36.925882 4610 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.025175 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-internal-tls-certs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.025499 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-logs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.025644 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22cg\" (UniqueName: \"kubernetes.io/projected/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-kube-api-access-q22cg\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.025764 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-public-tls-certs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.025905 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-logs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.026089 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-combined-ca-bundle\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.026204 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-config-data\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.026314 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-config-data-custom\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.036858 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-internal-tls-certs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.040091 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-config-data\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.040627 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-config-data-custom\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.063241 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-combined-ca-bundle\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.063537 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-public-tls-certs\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.083867 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22cg\" (UniqueName: \"kubernetes.io/projected/8416ea1e-d79e-4dc2-8902-d59c8c4bbc60-kube-api-access-q22cg\") pod \"barbican-api-6849467754-2xpgn\" (UID: \"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60\") " pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.127234 4610 generic.go:334] "Generic (PLEG): container finished" podID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" containerID="8bfc650ceeae3f526c02589de3817799b86fc030d345e9699069dc45c8242cc3" exitCode=0 Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.127551 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jflcj" event={"ID":"2ca74dbf-7969-4a03-a618-83505fc9c7ec","Type":"ContainerDied","Data":"8bfc650ceeae3f526c02589de3817799b86fc030d345e9699069dc45c8242cc3"} Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.373904 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:37 crc kubenswrapper[4610]: I1006 09:01:37.890036 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6849467754-2xpgn"] Oct 06 09:01:37 crc kubenswrapper[4610]: W1006 09:01:37.912282 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8416ea1e_d79e_4dc2_8902_d59c8c4bbc60.slice/crio-a20235034952188dfcc5b78b4893855a76ec264042d0323e8e27f389981a58e0 WatchSource:0}: Error finding container a20235034952188dfcc5b78b4893855a76ec264042d0323e8e27f389981a58e0: Status 404 returned error can't find the container with id a20235034952188dfcc5b78b4893855a76ec264042d0323e8e27f389981a58e0 Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.181430 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" event={"ID":"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d","Type":"ContainerStarted","Data":"4d94b904e717db7ee2d0f26b9de302222386e14093d82ea9b71b5a38424d6167"} Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.181751 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" event={"ID":"9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d","Type":"ContainerStarted","Data":"5fd5626b36a06b02d1b620b8fde80e2fb3f0dba84f181007d6217f5b42e9f478"} Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.190255 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-776cb49575-gr6gq" event={"ID":"8e13d8e7-f118-4d18-ab69-162dadc7f649","Type":"ContainerStarted","Data":"bc5527cadffbcf92e6e39583071e1411cefc645af6c6d7ddcb4158ad4df05a46"} Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.190290 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-776cb49575-gr6gq" event={"ID":"8e13d8e7-f118-4d18-ab69-162dadc7f649","Type":"ContainerStarted","Data":"c6d36e6f951fe0da74530512b94dbe4c92e7bba9557db8a6d70faa7604b17133"} Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.224461 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6849467754-2xpgn" event={"ID":"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60","Type":"ContainerStarted","Data":"a20235034952188dfcc5b78b4893855a76ec264042d0323e8e27f389981a58e0"} Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.226383 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-79f9c7c798-mtvls" podStartSLOduration=3.226341512 podStartE2EDuration="5.226359656s" podCreationTimestamp="2025-10-06 09:01:33 +0000 UTC" firstStartedPulling="2025-10-06 09:01:34.798492176 +0000 UTC m=+1226.513545564" lastFinishedPulling="2025-10-06 09:01:36.79851032 +0000 UTC m=+1228.513563708" observedRunningTime="2025-10-06 09:01:38.206008374 +0000 UTC m=+1229.921061772" watchObservedRunningTime="2025-10-06 09:01:38.226359656 +0000 UTC m=+1229.941413044" Oct 06 09:01:38 crc kubenswrapper[4610]: I1006 09:01:38.233043 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-776cb49575-gr6gq" podStartSLOduration=2.883414896 podStartE2EDuration="5.233024053s" podCreationTimestamp="2025-10-06 09:01:33 +0000 UTC" firstStartedPulling="2025-10-06 09:01:34.445431995 +0000 UTC m=+1226.160485383" lastFinishedPulling="2025-10-06 09:01:36.795041152 +0000 UTC m=+1228.510094540" observedRunningTime="2025-10-06 09:01:38.230275954 +0000 
UTC m=+1229.945329362" watchObservedRunningTime="2025-10-06 09:01:38.233024053 +0000 UTC m=+1229.948077441" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.788672 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jflcj" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.793537 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca74dbf-7969-4a03-a618-83505fc9c7ec-etc-machine-id\") pod \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.793590 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-config-data\") pod \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.793645 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-db-sync-config-data\") pod \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.793665 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca74dbf-7969-4a03-a618-83505fc9c7ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2ca74dbf-7969-4a03-a618-83505fc9c7ec" (UID: "2ca74dbf-7969-4a03-a618-83505fc9c7ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.793699 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-combined-ca-bundle\") pod \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.793720 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-scripts\") pod \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.794065 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm9cx\" (UniqueName: \"kubernetes.io/projected/2ca74dbf-7969-4a03-a618-83505fc9c7ec-kube-api-access-wm9cx\") pod \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\" (UID: \"2ca74dbf-7969-4a03-a618-83505fc9c7ec\") " Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.794334 4610 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca74dbf-7969-4a03-a618-83505fc9c7ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.797454 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-scripts" (OuterVolumeSpecName: "scripts") pod "2ca74dbf-7969-4a03-a618-83505fc9c7ec" (UID: "2ca74dbf-7969-4a03-a618-83505fc9c7ec"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.801207 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca74dbf-7969-4a03-a618-83505fc9c7ec-kube-api-access-wm9cx" (OuterVolumeSpecName: "kube-api-access-wm9cx") pod "2ca74dbf-7969-4a03-a618-83505fc9c7ec" (UID: "2ca74dbf-7969-4a03-a618-83505fc9c7ec"). InnerVolumeSpecName "kube-api-access-wm9cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.804001 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2ca74dbf-7969-4a03-a618-83505fc9c7ec" (UID: "2ca74dbf-7969-4a03-a618-83505fc9c7ec"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.896033 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm9cx\" (UniqueName: \"kubernetes.io/projected/2ca74dbf-7969-4a03-a618-83505fc9c7ec-kube-api-access-wm9cx\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.896078 4610 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.896090 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.903219 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ca74dbf-7969-4a03-a618-83505fc9c7ec" (UID: "2ca74dbf-7969-4a03-a618-83505fc9c7ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.978240 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-config-data" (OuterVolumeSpecName: "config-data") pod "2ca74dbf-7969-4a03-a618-83505fc9c7ec" (UID: "2ca74dbf-7969-4a03-a618-83505fc9c7ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.997311 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:38.997338 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca74dbf-7969-4a03-a618-83505fc9c7ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.279185 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6849467754-2xpgn" event={"ID":"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60","Type":"ContainerStarted","Data":"e4ac32d303da1d090811da710104addfa5debaf4a3a4c7f9169099c1faf924b8"} Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.286852 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jflcj" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.287347 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jflcj" event={"ID":"2ca74dbf-7969-4a03-a618-83505fc9c7ec","Type":"ContainerDied","Data":"f7866e176176a77e9dc26cb0e59f11edc9e033c3851ad535b85050a7e3191c2b"} Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.287368 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7866e176176a77e9dc26cb0e59f11edc9e033c3851ad535b85050a7e3191c2b" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.382259 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:01:39 crc kubenswrapper[4610]: E1006 09:01:39.382583 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" containerName="cinder-db-sync" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.382600 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" containerName="cinder-db-sync" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.382782 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" containerName="cinder-db-sync" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.383661 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.390670 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n464c" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.390843 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.390872 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.391700 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.416190 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.493388 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-9x6x8"] Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.496186 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerName="dnsmasq-dns" containerID="cri-o://2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2" gracePeriod=10 Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.517791 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.517853 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/478966be-8437-4700-9ee1-6692e4ef7a1e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.517880 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-scripts\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.517908 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.517965 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.518009 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26tb\" (UniqueName: 
\"kubernetes.io/projected/478966be-8437-4700-9ee1-6692e4ef7a1e-kube-api-access-l26tb\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.540687 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-k4x5f"] Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.546621 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.584228 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-k4x5f"] Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620286 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-scripts\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620335 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620385 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620411 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620441 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620463 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l26tb\" (UniqueName: \"kubernetes.io/projected/478966be-8437-4700-9ee1-6692e4ef7a1e-kube-api-access-l26tb\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620500 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-config\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620524 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620543 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620558 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620575 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqh5s\" (UniqueName: \"kubernetes.io/projected/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-kube-api-access-lqh5s\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620602 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/478966be-8437-4700-9ee1-6692e4ef7a1e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.620666 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/478966be-8437-4700-9ee1-6692e4ef7a1e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.630247 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-scripts\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.634145 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.634598 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.642647 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.679454 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26tb\" (UniqueName: \"kubernetes.io/projected/478966be-8437-4700-9ee1-6692e4ef7a1e-kube-api-access-l26tb\") pod \"cinder-scheduler-0\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.730490 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.731952 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.732019 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-config\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.732070 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.732099 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.732123 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqh5s\" (UniqueName: \"kubernetes.io/projected/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-kube-api-access-lqh5s\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.732406 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.733256 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.733750 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.734408 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-config\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.735383 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.738581 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.752924 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.776278 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqh5s\" (UniqueName: \"kubernetes.io/projected/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-kube-api-access-lqh5s\") pod \"dnsmasq-dns-6bb4fc677f-k4x5f\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.780541 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.780657 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.783529 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940204 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940319 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf5lr\" (UniqueName: \"kubernetes.io/projected/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-kube-api-access-nf5lr\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940347 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940389 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940418 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940435 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-logs\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:39 crc kubenswrapper[4610]: I1006 09:01:39.940449 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-scripts\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.015405 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.041983 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.045818 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.051279 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf5lr\" (UniqueName: \"kubernetes.io/projected/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-kube-api-access-nf5lr\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.051356 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.051430 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.051481 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.051505 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-logs\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.051528 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-scripts\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.052037 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.062539 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-scripts\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " 
pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.066354 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.066794 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.069170 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-logs\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.091608 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf5lr\" (UniqueName: \"kubernetes.io/projected/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-kube-api-access-nf5lr\") pod \"cinder-api-0\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.133943 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.256073 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.309225 4610 generic.go:334] "Generic (PLEG): container finished" podID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerID="2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2" exitCode=0 Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.309288 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" event={"ID":"2266d633-3c38-449f-8d38-bcb7c03d9871","Type":"ContainerDied","Data":"2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2"} Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.309315 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" event={"ID":"2266d633-3c38-449f-8d38-bcb7c03d9871","Type":"ContainerDied","Data":"b828ad1f10fb48658f9fcd0cbb5086b39650c49b5283d5add8880631d83b6b7c"} Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.309332 4610 scope.go:117] "RemoveContainer" containerID="2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.309452 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-9x6x8" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.319177 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6849467754-2xpgn" event={"ID":"8416ea1e-d79e-4dc2-8902-d59c8c4bbc60","Type":"ContainerStarted","Data":"dea1f9ee681261b23532e4a2bcdb8e2576d38a42851dee420dac9b424f560fe3"} Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.320453 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.320486 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.359362 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.359649 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2g8t\" (UniqueName: \"kubernetes.io/projected/2266d633-3c38-449f-8d38-bcb7c03d9871-kube-api-access-s2g8t\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.359739 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-config\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.359763 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-nb\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.359794 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-svc\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.360090 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-sb\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.381765 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2266d633-3c38-449f-8d38-bcb7c03d9871-kube-api-access-s2g8t" (OuterVolumeSpecName: "kube-api-access-s2g8t") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "kube-api-access-s2g8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.433212 4610 scope.go:117] "RemoveContainer" containerID="c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.481200 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.481269 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2g8t\" (UniqueName: \"kubernetes.io/projected/2266d633-3c38-449f-8d38-bcb7c03d9871-kube-api-access-s2g8t\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.568883 4610 scope.go:117] "RemoveContainer" containerID="2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2" Oct 06 09:01:40 crc kubenswrapper[4610]: E1006 09:01:40.572758 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2\": container with ID starting with 2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2 not found: ID does not exist" containerID="2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.572815 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2"} err="failed to get container status \"2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2\": rpc error: code = NotFound desc = could not find container \"2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2\": container with ID starting with 2dc61ae127dfa38b5f020b1a7a36c14d656cc4407746c3a12c765155e1175ce2 not found: ID does not exist" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.572844 4610 scope.go:117] "RemoveContainer" containerID="c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.589210 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:40 crc kubenswrapper[4610]: W1006 09:01:40.589550 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478966be_8437_4700_9ee1_6692e4ef7a1e.slice/crio-234be4e7e6193e6aa415e97b88ef92e7077b6711207bc62ca43c47c3674e35f6 WatchSource:0}: Error finding container 234be4e7e6193e6aa415e97b88ef92e7077b6711207bc62ca43c47c3674e35f6: Status 404 returned error can't find the container with id 234be4e7e6193e6aa415e97b88ef92e7077b6711207bc62ca43c47c3674e35f6 Oct 06 09:01:40 crc kubenswrapper[4610]: E1006 09:01:40.589770 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac\": container with ID starting with c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac not found: ID 
does not exist" containerID="c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.589806 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac"} err="failed to get container status \"c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac\": rpc error: code = NotFound desc = could not find container \"c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac\": container with ID starting with c017e9faab33c3e40cadbfd34bff3cef74b468819a1e46c78295a88671952dac not found: ID does not exist" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.610331 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6849467754-2xpgn" podStartSLOduration=4.610309776 podStartE2EDuration="4.610309776s" podCreationTimestamp="2025-10-06 09:01:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:40.346626101 +0000 UTC m=+1232.061679489" watchObservedRunningTime="2025-10-06 09:01:40.610309776 +0000 UTC m=+1232.325363174" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.623807 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.637616 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.657770 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-config" (OuterVolumeSpecName: "config") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.693366 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.693455 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0\") pod \"2266d633-3c38-449f-8d38-bcb7c03d9871\" (UID: \"2266d633-3c38-449f-8d38-bcb7c03d9871\") " Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.693816 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.693850 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:40 crc kubenswrapper[4610]: W1006 09:01:40.693929 4610 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2266d633-3c38-449f-8d38-bcb7c03d9871/volumes/kubernetes.io~configmap/dns-swift-storage-0 Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.693940 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.698241 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2266d633-3c38-449f-8d38-bcb7c03d9871" (UID: "2266d633-3c38-449f-8d38-bcb7c03d9871"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.796320 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.796375 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2266d633-3c38-449f-8d38-bcb7c03d9871-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.973957 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-9x6x8"] Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.984962 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-9x6x8"] Oct 06 09:01:40 crc kubenswrapper[4610]: I1006 09:01:40.993193 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:01:41 crc kubenswrapper[4610]: W1006 09:01:41.003189 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5490ccd6_2c49_47b6_a3e0_9d068b0080b4.slice/crio-f58ec056b7721a38ec1a755b4261df5a78540fb1cc7c5d980093503ba26097fe WatchSource:0}: Error finding container f58ec056b7721a38ec1a755b4261df5a78540fb1cc7c5d980093503ba26097fe: Status 404 returned error can't find the container with id f58ec056b7721a38ec1a755b4261df5a78540fb1cc7c5d980093503ba26097fe Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.083334 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" path="/var/lib/kubelet/pods/2266d633-3c38-449f-8d38-bcb7c03d9871/volumes" Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.130524 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-k4x5f"] Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.262331 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.330894 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" event={"ID":"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9","Type":"ContainerStarted","Data":"25e25665788695f95d2b8fc6c73d6a29d02b99fec812431c8b79d86a3e152154"} Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.340384 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5490ccd6-2c49-47b6-a3e0-9d068b0080b4","Type":"ContainerStarted","Data":"f58ec056b7721a38ec1a755b4261df5a78540fb1cc7c5d980093503ba26097fe"} Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.348546 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"478966be-8437-4700-9ee1-6692e4ef7a1e","Type":"ContainerStarted","Data":"234be4e7e6193e6aa415e97b88ef92e7077b6711207bc62ca43c47c3674e35f6"} Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.422480 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-84759bdbdc-r6gkv" Oct 06 09:01:41 crc kubenswrapper[4610]: I1006 09:01:41.542841 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b7659d8b-729ds" Oct 06 09:01:42 crc kubenswrapper[4610]: I1006 09:01:42.367323 4610 
generic.go:334] "Generic (PLEG): container finished" podID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerID="4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc" exitCode=0 Oct 06 09:01:42 crc kubenswrapper[4610]: I1006 09:01:42.367404 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" event={"ID":"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9","Type":"ContainerDied","Data":"4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc"} Oct 06 09:01:42 crc kubenswrapper[4610]: I1006 09:01:42.384879 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5490ccd6-2c49-47b6-a3e0-9d068b0080b4","Type":"ContainerStarted","Data":"57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5"} Oct 06 09:01:42 crc kubenswrapper[4610]: I1006 09:01:42.767653 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.401164 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"478966be-8437-4700-9ee1-6692e4ef7a1e","Type":"ContainerStarted","Data":"4c63c21000cf6c5148e61323072ed04670dad864343aa3332d1f91bea47ba69d"} Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.404909 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" event={"ID":"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9","Type":"ContainerStarted","Data":"fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163"} Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.405087 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.408742 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5490ccd6-2c49-47b6-a3e0-9d068b0080b4","Type":"ContainerStarted","Data":"2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7"} Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.408866 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api-log" containerID="cri-o://57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5" gracePeriod=30 Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.408947 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.408987 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api" containerID="cri-o://2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7" gracePeriod=30 Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.430743 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" podStartSLOduration=4.430725182 podStartE2EDuration="4.430725182s" podCreationTimestamp="2025-10-06 09:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:43.41909907 +0000 UTC m=+1235.134152468" watchObservedRunningTime="2025-10-06 09:01:43.430725182 +0000 UTC m=+1235.145778560" Oct 06 09:01:43 crc kubenswrapper[4610]: I1006 09:01:43.457303 4610 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.45728382 podStartE2EDuration="4.45728382s" podCreationTimestamp="2025-10-06 09:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:01:43.439686377 +0000 UTC m=+1235.154739785" watchObservedRunningTime="2025-10-06 09:01:43.45728382 +0000 UTC m=+1235.172337208" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.419936 4610 generic.go:334] "Generic (PLEG): container finished" podID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerID="57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5" exitCode=143 Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.421108 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5490ccd6-2c49-47b6-a3e0-9d068b0080b4","Type":"ContainerDied","Data":"57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5"} Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.423396 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"478966be-8437-4700-9ee1-6692e4ef7a1e","Type":"ContainerStarted","Data":"4e1d24942df3ea5781dea73d59cca7378ab9f3a7dc1034c32e38b1cf4597de46"} Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.445971 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.617095456 podStartE2EDuration="5.445954531s" podCreationTimestamp="2025-10-06 09:01:39 +0000 UTC" firstStartedPulling="2025-10-06 09:01:40.619420375 +0000 UTC m=+1232.334473763" lastFinishedPulling="2025-10-06 09:01:41.44827945 +0000 UTC m=+1233.163332838" observedRunningTime="2025-10-06 09:01:44.441670953 +0000 UTC m=+1236.156724381" watchObservedRunningTime="2025-10-06 09:01:44.445954531 +0000 UTC m=+1236.161007919" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.607938 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 09:01:44 crc kubenswrapper[4610]: E1006 09:01:44.608849 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerName="init" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.608879 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerName="init" Oct 06 09:01:44 crc kubenswrapper[4610]: E1006 09:01:44.608953 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerName="dnsmasq-dns" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.608964 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerName="dnsmasq-dns" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.609221 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="2266d633-3c38-449f-8d38-bcb7c03d9871" containerName="dnsmasq-dns" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.610094 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.612616 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.612924 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x5njj" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.613338 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.618461 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.731231 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.772458 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gjd5\" (UniqueName: \"kubernetes.io/projected/5b7473c8-fdfd-426a-99da-57bc4175e303-kube-api-access-6gjd5\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.772523 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7473c8-fdfd-426a-99da-57bc4175e303-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.772770 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b7473c8-fdfd-426a-99da-57bc4175e303-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.772958 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b7473c8-fdfd-426a-99da-57bc4175e303-openstack-config\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.875119 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gjd5\" (UniqueName: \"kubernetes.io/projected/5b7473c8-fdfd-426a-99da-57bc4175e303-kube-api-access-6gjd5\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.875181 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7473c8-fdfd-426a-99da-57bc4175e303-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.875221 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b7473c8-fdfd-426a-99da-57bc4175e303-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.875264 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b7473c8-fdfd-426a-99da-57bc4175e303-openstack-config\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.876196 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b7473c8-fdfd-426a-99da-57bc4175e303-openstack-config\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.881476 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b7473c8-fdfd-426a-99da-57bc4175e303-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.881935 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7473c8-fdfd-426a-99da-57bc4175e303-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.894340 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gjd5\" (UniqueName: \"kubernetes.io/projected/5b7473c8-fdfd-426a-99da-57bc4175e303-kube-api-access-6gjd5\") pod \"openstackclient\" (UID: \"5b7473c8-fdfd-426a-99da-57bc4175e303\") " pod="openstack/openstackclient" Oct 06 09:01:44 crc kubenswrapper[4610]: I1006 09:01:44.932118 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 09:01:45 crc kubenswrapper[4610]: I1006 09:01:45.548234 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 09:01:45 crc kubenswrapper[4610]: W1006 09:01:45.559227 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b7473c8_fdfd_426a_99da_57bc4175e303.slice/crio-0d7813294b934f51ff65ccbe9e15b70e6bac684796655656364d08e333d42563 WatchSource:0}: Error finding container 0d7813294b934f51ff65ccbe9e15b70e6bac684796655656364d08e333d42563: Status 404 returned error can't find the container with id 0d7813294b934f51ff65ccbe9e15b70e6bac684796655656364d08e333d42563 Oct 06 09:01:46 crc kubenswrapper[4610]: I1006 09:01:46.408186 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:46 crc kubenswrapper[4610]: I1006 09:01:46.464757 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5b7473c8-fdfd-426a-99da-57bc4175e303","Type":"ContainerStarted","Data":"0d7813294b934f51ff65ccbe9e15b70e6bac684796655656364d08e333d42563"} Oct 06 09:01:46 crc kubenswrapper[4610]: I1006 09:01:46.729658 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:01:46 crc kubenswrapper[4610]: I1006 09:01:46.915618 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Oct 06 09:01:46 crc kubenswrapper[4610]: I1006 09:01:46.915958 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:01:49 crc kubenswrapper[4610]: I1006 09:01:49.733748 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.166:8080/\": dial tcp 10.217.0.166:8080: connect: connection refused" Oct 06 09:01:49 crc kubenswrapper[4610]: I1006 09:01:49.826660 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:01:50 crc kubenswrapper[4610]: I1006 09:01:50.016549 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:01:50 crc kubenswrapper[4610]: I1006 09:01:50.121895 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k4qm6"] Oct 06 09:01:50 crc kubenswrapper[4610]: I1006 09:01:50.122375 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerName="dnsmasq-dns" containerID="cri-o://5b760dc66259c04d6b71169f424024306a9a9e518ba7af7c73add9d190a1eccc" gracePeriod=10 Oct 06 09:01:50 crc kubenswrapper[4610]: I1006 09:01:50.518721 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:50 crc kubenswrapper[4610]: I1006 09:01:50.708147 4610 prober.go:107] "Probe failed" probeType="Liveness" 
pod="metallb-system/frr-k8s-rmk9w" podUID="0f88e64c-929a-4a97-a3a1-a92face17060" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 09:01:51 crc kubenswrapper[4610]: I1006 09:01:51.547386 4610 generic.go:334] "Generic (PLEG): container finished" podID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerID="5b760dc66259c04d6b71169f424024306a9a9e518ba7af7c73add9d190a1eccc" exitCode=0 Oct 06 09:01:51 crc kubenswrapper[4610]: I1006 09:01:51.547473 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" event={"ID":"098bf04f-4224-4657-b2ab-828f7194a6ea","Type":"ContainerDied","Data":"5b760dc66259c04d6b71169f424024306a9a9e518ba7af7c73add9d190a1eccc"} Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.129760 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.137672 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6849467754-2xpgn" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.213688 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54b57c9f78-lbmlj"] Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.213906 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" containerID="cri-o://6f2c41817a62375df35be86b9797fb05a7fd706c6e01cb42f5ef271d8320a9e6" gracePeriod=30 Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.214006 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api" containerID="cri-o://e54199d7ed64789b07edfaa8797f5c347ec8f27d3c16ad9fbbae7840a6239288" gracePeriod=30 Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.217716 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-svc\") pod \"098bf04f-4224-4657-b2ab-828f7194a6ea\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.217784 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-nb\") pod \"098bf04f-4224-4657-b2ab-828f7194a6ea\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.217907 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-swift-storage-0\") pod \"098bf04f-4224-4657-b2ab-828f7194a6ea\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.217963 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-sb\") pod \"098bf04f-4224-4657-b2ab-828f7194a6ea\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.217982 
4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-config\") pod \"098bf04f-4224-4657-b2ab-828f7194a6ea\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.218075 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp7kj\" (UniqueName: \"kubernetes.io/projected/098bf04f-4224-4657-b2ab-828f7194a6ea-kube-api-access-wp7kj\") pod \"098bf04f-4224-4657-b2ab-828f7194a6ea\" (UID: \"098bf04f-4224-4657-b2ab-828f7194a6ea\") " Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.228378 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.239657 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/098bf04f-4224-4657-b2ab-828f7194a6ea-kube-api-access-wp7kj" (OuterVolumeSpecName: "kube-api-access-wp7kj") pod "098bf04f-4224-4657-b2ab-828f7194a6ea" (UID: "098bf04f-4224-4657-b2ab-828f7194a6ea"). InnerVolumeSpecName "kube-api-access-wp7kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.333153 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp7kj\" (UniqueName: \"kubernetes.io/projected/098bf04f-4224-4657-b2ab-828f7194a6ea-kube-api-access-wp7kj\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.405892 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "098bf04f-4224-4657-b2ab-828f7194a6ea" (UID: "098bf04f-4224-4657-b2ab-828f7194a6ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.423906 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "098bf04f-4224-4657-b2ab-828f7194a6ea" (UID: "098bf04f-4224-4657-b2ab-828f7194a6ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.434684 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.434709 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.434881 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "098bf04f-4224-4657-b2ab-828f7194a6ea" (UID: "098bf04f-4224-4657-b2ab-828f7194a6ea"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.442868 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "098bf04f-4224-4657-b2ab-828f7194a6ea" (UID: "098bf04f-4224-4657-b2ab-828f7194a6ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.462697 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-config" (OuterVolumeSpecName: "config") pod "098bf04f-4224-4657-b2ab-828f7194a6ea" (UID: "098bf04f-4224-4657-b2ab-828f7194a6ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.536713 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.536909 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.536982 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/098bf04f-4224-4657-b2ab-828f7194a6ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.562650 4610 generic.go:334] "Generic (PLEG): container finished" podID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerID="6f2c41817a62375df35be86b9797fb05a7fd706c6e01cb42f5ef271d8320a9e6" exitCode=143 Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.562854 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b57c9f78-lbmlj" event={"ID":"ff84256d-299b-45c7-8b7a-454d64cd0244","Type":"ContainerDied","Data":"6f2c41817a62375df35be86b9797fb05a7fd706c6e01cb42f5ef271d8320a9e6"} Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.566023 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" event={"ID":"098bf04f-4224-4657-b2ab-828f7194a6ea","Type":"ContainerDied","Data":"5d8485d38957439143e9228b0ed0343091a8979405db176ecc722398698a162b"} Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.567180 4610 scope.go:117] "RemoveContainer" containerID="5b760dc66259c04d6b71169f424024306a9a9e518ba7af7c73add9d190a1eccc" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.566357 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k4qm6" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.607172 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k4qm6"] Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.618503 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k4qm6"] Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.621364 4610 scope.go:117] "RemoveContainer" containerID="f25b7b2d92c57c71efcab161f302b4c7bd013835e48afc2212482cd0a612cc34" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.658776 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69d948d6bf-n5vv6" Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.726362 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-764bc97c84-flcb2"] Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.726867 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-764bc97c84-flcb2" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-api" containerID="cri-o://caec19a4f32b7d63efc63516b9720214f2bd61b4e25caa1b10e7f9e209953778" gracePeriod=30 Oct 06 09:01:52 crc kubenswrapper[4610]: I1006 09:01:52.727405 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-764bc97c84-flcb2" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-httpd" containerID="cri-o://e75f73484809e0670748ddfa9ecabb283cf437fb8d9f21bd30477ffbf9ea81c0" gracePeriod=30 Oct 06 09:01:53 crc kubenswrapper[4610]: I1006 09:01:53.089032 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" path="/var/lib/kubelet/pods/098bf04f-4224-4657-b2ab-828f7194a6ea/volumes" Oct 06 09:01:53 crc kubenswrapper[4610]: I1006 09:01:53.590297 4610 generic.go:334] "Generic (PLEG): container finished" podID="96270269-2a63-42c6-a881-36aba28d88ae" containerID="e75f73484809e0670748ddfa9ecabb283cf437fb8d9f21bd30477ffbf9ea81c0" exitCode=0 Oct 06 09:01:53 crc kubenswrapper[4610]: I1006 09:01:53.590663 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-764bc97c84-flcb2" event={"ID":"96270269-2a63-42c6-a881-36aba28d88ae","Type":"ContainerDied","Data":"e75f73484809e0670748ddfa9ecabb283cf437fb8d9f21bd30477ffbf9ea81c0"} Oct 06 09:01:54 crc kubenswrapper[4610]: I1006 09:01:54.961612 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.030084 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.175257 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.168:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.624956 4610 generic.go:334] "Generic (PLEG): container finished" podID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerID="980b61421f9d5498d548880b365ac0717f97ce6a22c27c5f878661d138a2b3b7" exitCode=137 Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.625267 4610 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="cinder-scheduler" containerID="cri-o://4c63c21000cf6c5148e61323072ed04670dad864343aa3332d1f91bea47ba69d" gracePeriod=30 Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.625541 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8454b778cb-f7b67" event={"ID":"e9c8eb3d-3866-4f23-8ebc-0357571f26a6","Type":"ContainerDied","Data":"980b61421f9d5498d548880b365ac0717f97ce6a22c27c5f878661d138a2b3b7"} Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.625828 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="probe" containerID="cri-o://4e1d24942df3ea5781dea73d59cca7378ab9f3a7dc1034c32e38b1cf4597de46" gracePeriod=30 Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.678465 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:41378->10.217.0.164:9311: read: connection reset by peer" Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.678877 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:41398->10.217.0.164:9311: read: connection reset by peer" Oct 06 09:01:55 crc kubenswrapper[4610]: I1006 09:01:55.678930 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:41392->10.217.0.164:9311: read: connection reset by peer" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.040922 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-c65b98c55-xjdpw"] Oct 06 09:01:56 crc kubenswrapper[4610]: E1006 09:01:56.041427 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerName="init" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.041446 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerName="init" Oct 06 09:01:56 crc kubenswrapper[4610]: E1006 09:01:56.041458 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerName="dnsmasq-dns" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.041467 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerName="dnsmasq-dns" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.041718 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="098bf04f-4224-4657-b2ab-828f7194a6ea" containerName="dnsmasq-dns" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.046345 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.048578 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.050123 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.058672 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.063599 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c65b98c55-xjdpw"] Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.232848 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6xw\" (UniqueName: \"kubernetes.io/projected/2d4cceaf-e744-49da-a634-84401f61d862-kube-api-access-hh6xw\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233240 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-config-data\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233299 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-internal-tls-certs\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233349 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d4cceaf-e744-49da-a634-84401f61d862-etc-swift\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233418 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d4cceaf-e744-49da-a634-84401f61d862-log-httpd\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233452 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-public-tls-certs\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233496 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d4cceaf-e744-49da-a634-84401f61d862-run-httpd\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " 
pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.233528 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-combined-ca-bundle\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335165 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6xw\" (UniqueName: \"kubernetes.io/projected/2d4cceaf-e744-49da-a634-84401f61d862-kube-api-access-hh6xw\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335364 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-config-data\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335451 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-internal-tls-certs\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335524 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d4cceaf-e744-49da-a634-84401f61d862-etc-swift\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335672 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d4cceaf-e744-49da-a634-84401f61d862-log-httpd\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335719 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-public-tls-certs\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335782 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d4cceaf-e744-49da-a634-84401f61d862-run-httpd\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.335826 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-combined-ca-bundle\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 
crc kubenswrapper[4610]: I1006 09:01:56.336162 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d4cceaf-e744-49da-a634-84401f61d862-log-httpd\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.336405 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d4cceaf-e744-49da-a634-84401f61d862-run-httpd\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.347343 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-internal-tls-certs\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.347373 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-combined-ca-bundle\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.348034 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-config-data\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.352691 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4cceaf-e744-49da-a634-84401f61d862-public-tls-certs\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.356542 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.360120 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6xw\" (UniqueName: \"kubernetes.io/projected/2d4cceaf-e744-49da-a634-84401f61d862-kube-api-access-hh6xw\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.367768 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d4cceaf-e744-49da-a634-84401f61d862-etc-swift\") pod \"swift-proxy-c65b98c55-xjdpw\" (UID: \"2d4cceaf-e744-49da-a634-84401f61d862\") " pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.389471 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.638796 4610 generic.go:334] "Generic (PLEG): container finished" podID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerID="e54199d7ed64789b07edfaa8797f5c347ec8f27d3c16ad9fbbae7840a6239288" exitCode=0 Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.638837 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b57c9f78-lbmlj" event={"ID":"ff84256d-299b-45c7-8b7a-454d64cd0244","Type":"ContainerDied","Data":"e54199d7ed64789b07edfaa8797f5c347ec8f27d3c16ad9fbbae7840a6239288"} Oct 06 09:01:56 crc kubenswrapper[4610]: I1006 09:01:56.915965 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8454b778cb-f7b67" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Oct 06 09:01:57 crc kubenswrapper[4610]: I1006 09:01:57.655381 4610 generic.go:334] "Generic (PLEG): container finished" podID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerID="4e1d24942df3ea5781dea73d59cca7378ab9f3a7dc1034c32e38b1cf4597de46" exitCode=0 Oct 06 09:01:57 crc kubenswrapper[4610]: I1006 09:01:57.655414 4610 generic.go:334] "Generic (PLEG): container finished" podID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerID="4c63c21000cf6c5148e61323072ed04670dad864343aa3332d1f91bea47ba69d" exitCode=0 Oct 06 09:01:57 crc kubenswrapper[4610]: I1006 09:01:57.655437 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"478966be-8437-4700-9ee1-6692e4ef7a1e","Type":"ContainerDied","Data":"4e1d24942df3ea5781dea73d59cca7378ab9f3a7dc1034c32e38b1cf4597de46"} Oct 06 09:01:57 crc kubenswrapper[4610]: I1006 09:01:57.655464 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"478966be-8437-4700-9ee1-6692e4ef7a1e","Type":"ContainerDied","Data":"4c63c21000cf6c5148e61323072ed04670dad864343aa3332d1f91bea47ba69d"} Oct 06 09:01:58 crc kubenswrapper[4610]: I1006 09:01:58.164519 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 09:01:58 crc kubenswrapper[4610]: I1006 09:01:58.936120 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused" Oct 06 09:01:58 crc kubenswrapper[4610]: I1006 09:01:58.936130 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54b57c9f78-lbmlj" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused" Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.093261 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.093647 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-central-agent" containerID="cri-o://ad8ec97a6d2bf074ed199d3e6854a04afeb4a19c2fa8cc10c8a52d445dd3e700" 
gracePeriod=30 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.094127 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="proxy-httpd" containerID="cri-o://d0cff3adc5fbd03695f5bd42ce8e4739fcdd77785b619e0ef57573894281ee26" gracePeriod=30 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.094206 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="sg-core" containerID="cri-o://b90c2921ed9406e9fa55ee0930e4263fe37219332d1c6a739991283ec66ee627" gracePeriod=30 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.094299 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-notification-agent" containerID="cri-o://f717e5f26cf60271be30adf7d9f51a9a861ccf4f3f830ad4bd0246a9ed421fe6" gracePeriod=30 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.691843 4610 generic.go:334] "Generic (PLEG): container finished" podID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerID="d0cff3adc5fbd03695f5bd42ce8e4739fcdd77785b619e0ef57573894281ee26" exitCode=0 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.691883 4610 generic.go:334] "Generic (PLEG): container finished" podID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerID="b90c2921ed9406e9fa55ee0930e4263fe37219332d1c6a739991283ec66ee627" exitCode=2 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.691893 4610 generic.go:334] "Generic (PLEG): container finished" podID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerID="ad8ec97a6d2bf074ed199d3e6854a04afeb4a19c2fa8cc10c8a52d445dd3e700" exitCode=0 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.691924 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerDied","Data":"d0cff3adc5fbd03695f5bd42ce8e4739fcdd77785b619e0ef57573894281ee26"} Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.691979 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerDied","Data":"b90c2921ed9406e9fa55ee0930e4263fe37219332d1c6a739991283ec66ee627"} Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.691992 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerDied","Data":"ad8ec97a6d2bf074ed199d3e6854a04afeb4a19c2fa8cc10c8a52d445dd3e700"} Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.705488 4610 generic.go:334] "Generic (PLEG): container finished" podID="96270269-2a63-42c6-a881-36aba28d88ae" containerID="caec19a4f32b7d63efc63516b9720214f2bd61b4e25caa1b10e7f9e209953778" exitCode=0 Oct 06 09:01:59 crc kubenswrapper[4610]: I1006 09:01:59.705529 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-764bc97c84-flcb2" event={"ID":"96270269-2a63-42c6-a881-36aba28d88ae","Type":"ContainerDied","Data":"caec19a4f32b7d63efc63516b9720214f2bd61b4e25caa1b10e7f9e209953778"} Oct 06 09:02:00 crc kubenswrapper[4610]: I1006 09:02:00.722911 4610 generic.go:334] "Generic (PLEG): container finished" podID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerID="f717e5f26cf60271be30adf7d9f51a9a861ccf4f3f830ad4bd0246a9ed421fe6" exitCode=0 Oct 06 09:02:00 crc 
kubenswrapper[4610]: I1006 09:02:00.723244 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerDied","Data":"f717e5f26cf60271be30adf7d9f51a9a861ccf4f3f830ad4bd0246a9ed421fe6"} Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.372003 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.475442 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-combined-ca-bundle\") pod \"ff84256d-299b-45c7-8b7a-454d64cd0244\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.475747 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff84256d-299b-45c7-8b7a-454d64cd0244-logs\") pod \"ff84256d-299b-45c7-8b7a-454d64cd0244\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.475779 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data-custom\") pod \"ff84256d-299b-45c7-8b7a-454d64cd0244\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.475882 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsvnm\" (UniqueName: \"kubernetes.io/projected/ff84256d-299b-45c7-8b7a-454d64cd0244-kube-api-access-zsvnm\") pod \"ff84256d-299b-45c7-8b7a-454d64cd0244\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.475963 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data\") pod \"ff84256d-299b-45c7-8b7a-454d64cd0244\" (UID: \"ff84256d-299b-45c7-8b7a-454d64cd0244\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.476442 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff84256d-299b-45c7-8b7a-454d64cd0244-logs" (OuterVolumeSpecName: "logs") pod "ff84256d-299b-45c7-8b7a-454d64cd0244" (UID: "ff84256d-299b-45c7-8b7a-454d64cd0244"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.482871 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff84256d-299b-45c7-8b7a-454d64cd0244-kube-api-access-zsvnm" (OuterVolumeSpecName: "kube-api-access-zsvnm") pod "ff84256d-299b-45c7-8b7a-454d64cd0244" (UID: "ff84256d-299b-45c7-8b7a-454d64cd0244"). InnerVolumeSpecName "kube-api-access-zsvnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.492573 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff84256d-299b-45c7-8b7a-454d64cd0244" (UID: "ff84256d-299b-45c7-8b7a-454d64cd0244"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.537523 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff84256d-299b-45c7-8b7a-454d64cd0244" (UID: "ff84256d-299b-45c7-8b7a-454d64cd0244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.586498 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsvnm\" (UniqueName: \"kubernetes.io/projected/ff84256d-299b-45c7-8b7a-454d64cd0244-kube-api-access-zsvnm\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.586543 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.586556 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff84256d-299b-45c7-8b7a-454d64cd0244-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.586568 4610 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.598161 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data" (OuterVolumeSpecName: "config-data") pod "ff84256d-299b-45c7-8b7a-454d64cd0244" (UID: "ff84256d-299b-45c7-8b7a-454d64cd0244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.646898 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.660029 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.660613 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.687929 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-combined-ca-bundle\") pod \"478966be-8437-4700-9ee1-6692e4ef7a1e\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.687976 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-combined-ca-bundle\") pod \"96270269-2a63-42c6-a881-36aba28d88ae\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.687996 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/478966be-8437-4700-9ee1-6692e4ef7a1e-etc-machine-id\") pod \"478966be-8437-4700-9ee1-6692e4ef7a1e\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688036 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-combined-ca-bundle\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688082 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-httpd-config\") pod \"96270269-2a63-42c6-a881-36aba28d88ae\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688107 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-tls-certs\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688132 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-scripts\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688182 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-logs\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688207 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-secret-key\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688236 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-config\") pod \"96270269-2a63-42c6-a881-36aba28d88ae\" (UID: 
\"96270269-2a63-42c6-a881-36aba28d88ae\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688254 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-scripts\") pod \"478966be-8437-4700-9ee1-6692e4ef7a1e\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688285 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdmx6\" (UniqueName: \"kubernetes.io/projected/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-kube-api-access-sdmx6\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688333 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr8jj\" (UniqueName: \"kubernetes.io/projected/96270269-2a63-42c6-a881-36aba28d88ae-kube-api-access-cr8jj\") pod \"96270269-2a63-42c6-a881-36aba28d88ae\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688364 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l26tb\" (UniqueName: \"kubernetes.io/projected/478966be-8437-4700-9ee1-6692e4ef7a1e-kube-api-access-l26tb\") pod \"478966be-8437-4700-9ee1-6692e4ef7a1e\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688382 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-config-data\") pod \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\" (UID: \"e9c8eb3d-3866-4f23-8ebc-0357571f26a6\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688426 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data-custom\") pod \"478966be-8437-4700-9ee1-6692e4ef7a1e\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688451 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-ovndb-tls-certs\") pod \"96270269-2a63-42c6-a881-36aba28d88ae\" (UID: \"96270269-2a63-42c6-a881-36aba28d88ae\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688466 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data\") pod \"478966be-8437-4700-9ee1-6692e4ef7a1e\" (UID: \"478966be-8437-4700-9ee1-6692e4ef7a1e\") " Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.688807 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff84256d-299b-45c7-8b7a-454d64cd0244-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.705511 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478966be-8437-4700-9ee1-6692e4ef7a1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "478966be-8437-4700-9ee1-6692e4ef7a1e" (UID: "478966be-8437-4700-9ee1-6692e4ef7a1e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.712884 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "96270269-2a63-42c6-a881-36aba28d88ae" (UID: "96270269-2a63-42c6-a881-36aba28d88ae"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.712880 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.715322 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-logs" (OuterVolumeSpecName: "logs") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.760598 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5b7473c8-fdfd-426a-99da-57bc4175e303","Type":"ContainerStarted","Data":"8a6707c44675f744cc96e67c818d0d5a321237117808fb5ecd503cd4d9ae392f"} Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.762361 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-kube-api-access-sdmx6" (OuterVolumeSpecName: "kube-api-access-sdmx6") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "kube-api-access-sdmx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.764880 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-scripts" (OuterVolumeSpecName: "scripts") pod "478966be-8437-4700-9ee1-6692e4ef7a1e" (UID: "478966be-8437-4700-9ee1-6692e4ef7a1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.765633 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96270269-2a63-42c6-a881-36aba28d88ae-kube-api-access-cr8jj" (OuterVolumeSpecName: "kube-api-access-cr8jj") pod "96270269-2a63-42c6-a881-36aba28d88ae" (UID: "96270269-2a63-42c6-a881-36aba28d88ae"). InnerVolumeSpecName "kube-api-access-cr8jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.768537 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "478966be-8437-4700-9ee1-6692e4ef7a1e" (UID: "478966be-8437-4700-9ee1-6692e4ef7a1e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.771725 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478966be-8437-4700-9ee1-6692e4ef7a1e-kube-api-access-l26tb" (OuterVolumeSpecName: "kube-api-access-l26tb") pod "478966be-8437-4700-9ee1-6692e4ef7a1e" (UID: "478966be-8437-4700-9ee1-6692e4ef7a1e"). InnerVolumeSpecName "kube-api-access-l26tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.787356 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.788598 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"478966be-8437-4700-9ee1-6692e4ef7a1e","Type":"ContainerDied","Data":"234be4e7e6193e6aa415e97b88ef92e7077b6711207bc62ca43c47c3674e35f6"} Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795103 4610 scope.go:117] "RemoveContainer" containerID="4e1d24942df3ea5781dea73d59cca7378ab9f3a7dc1034c32e38b1cf4597de46" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.791149 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795272 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795285 4610 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795295 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795304 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdmx6\" (UniqueName: \"kubernetes.io/projected/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-kube-api-access-sdmx6\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795315 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr8jj\" (UniqueName: \"kubernetes.io/projected/96270269-2a63-42c6-a881-36aba28d88ae-kube-api-access-cr8jj\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795325 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l26tb\" (UniqueName: \"kubernetes.io/projected/478966be-8437-4700-9ee1-6692e4ef7a1e-kube-api-access-l26tb\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795334 4610 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.795342 4610 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/478966be-8437-4700-9ee1-6692e4ef7a1e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.798855 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.207094531 podStartE2EDuration="18.798834259s" podCreationTimestamp="2025-10-06 09:01:44 +0000 UTC" firstStartedPulling="2025-10-06 09:01:45.564421394 +0000 UTC m=+1237.279474782" lastFinishedPulling="2025-10-06 09:02:02.156161122 +0000 UTC m=+1253.871214510" observedRunningTime="2025-10-06 09:02:02.779966045 +0000 UTC m=+1254.495019433" watchObservedRunningTime="2025-10-06 09:02:02.798834259 +0000 UTC m=+1254.513887657" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.810297 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8454b778cb-f7b67" event={"ID":"e9c8eb3d-3866-4f23-8ebc-0357571f26a6","Type":"ContainerDied","Data":"28241f3730c57708bb8af3e326c0d9becd87ffa9455da4c95569af0581f8846d"} Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.810405 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8454b778cb-f7b67" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.849221 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-764bc97c84-flcb2" event={"ID":"96270269-2a63-42c6-a881-36aba28d88ae","Type":"ContainerDied","Data":"9be5b77c267c4a4369669e84ad16461209c2e84eec0f431e36a1108c65d75393"} Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.849323 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-764bc97c84-flcb2" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.850871 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-config-data" (OuterVolumeSpecName: "config-data") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.853419 4610 scope.go:117] "RemoveContainer" containerID="4c63c21000cf6c5148e61323072ed04670dad864343aa3332d1f91bea47ba69d" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.855062 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.856388 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b57c9f78-lbmlj" event={"ID":"ff84256d-299b-45c7-8b7a-454d64cd0244","Type":"ContainerDied","Data":"57b26f036b30d9f0530fbbe1725c89405553740a907f9c5dabec8add3a78c9d3"} Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.856454 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54b57c9f78-lbmlj" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.889964 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96270269-2a63-42c6-a881-36aba28d88ae" (UID: "96270269-2a63-42c6-a881-36aba28d88ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.897118 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.900845 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.901459 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:02 crc kubenswrapper[4610]: I1006 09:02:02.983633 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-scripts" (OuterVolumeSpecName: "scripts") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.008175 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "96270269-2a63-42c6-a881-36aba28d88ae" (UID: "96270269-2a63-42c6-a881-36aba28d88ae"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.023196 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-config" (OuterVolumeSpecName: "config") pod "96270269-2a63-42c6-a881-36aba28d88ae" (UID: "96270269-2a63-42c6-a881-36aba28d88ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.025400 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "478966be-8437-4700-9ee1-6692e4ef7a1e" (UID: "478966be-8437-4700-9ee1-6692e4ef7a1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.035142 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.035173 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.035183 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.035193 4610 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96270269-2a63-42c6-a881-36aba28d88ae-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.039461 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e9c8eb3d-3866-4f23-8ebc-0357571f26a6" (UID: "e9c8eb3d-3866-4f23-8ebc-0357571f26a6"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.107797 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data" (OuterVolumeSpecName: "config-data") pod "478966be-8437-4700-9ee1-6692e4ef7a1e" (UID: "478966be-8437-4700-9ee1-6692e4ef7a1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.136736 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478966be-8437-4700-9ee1-6692e4ef7a1e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.136766 4610 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c8eb3d-3866-4f23-8ebc-0357571f26a6-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.153357 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c65b98c55-xjdpw"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.156187 4610 scope.go:117] "RemoveContainer" containerID="44f2e86c3222c2dd45912193c4924309433a91696bc5355fe2a799b451e98b1f" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.186554 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238333 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-run-httpd\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238505 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r57zd\" (UniqueName: \"kubernetes.io/projected/a68c0582-1bb2-4c8e-a81a-730d123ae768-kube-api-access-r57zd\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238641 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-config-data\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238728 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-sg-core-conf-yaml\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238751 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-log-httpd\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238787 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-scripts\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.238817 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-combined-ca-bundle\") pod \"a68c0582-1bb2-4c8e-a81a-730d123ae768\" (UID: \"a68c0582-1bb2-4c8e-a81a-730d123ae768\") " Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.243455 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.248610 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.250425 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-scripts" (OuterVolumeSpecName: "scripts") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.289091 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-764bc97c84-flcb2"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.292428 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68c0582-1bb2-4c8e-a81a-730d123ae768-kube-api-access-r57zd" (OuterVolumeSpecName: "kube-api-access-r57zd") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "kube-api-access-r57zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.309193 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-764bc97c84-flcb2"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.337218 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54b57c9f78-lbmlj"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.344570 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.344598 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.344607 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a68c0582-1bb2-4c8e-a81a-730d123ae768-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.344616 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r57zd\" (UniqueName: \"kubernetes.io/projected/a68c0582-1bb2-4c8e-a81a-730d123ae768-kube-api-access-r57zd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.347332 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.353543 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.365113 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54b57c9f78-lbmlj"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.375153 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8454b778cb-f7b67"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.400257 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8454b778cb-f7b67"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.405541 4610 scope.go:117] "RemoveContainer" containerID="980b61421f9d5498d548880b365ac0717f97ce6a22c27c5f878661d138a2b3b7" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.416159 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.416380 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-log" containerID="cri-o://58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab" gracePeriod=30 Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.416755 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-httpd" containerID="cri-o://a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a" gracePeriod=30 Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.445228 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.455341 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.455391 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.468318 4610 scope.go:117] "RemoveContainer" containerID="e75f73484809e0670748ddfa9ecabb283cf437fb8d9f21bd30477ffbf9ea81c0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.478913 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.488324 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-config-data" (OuterVolumeSpecName: "config-data") pod "a68c0582-1bb2-4c8e-a81a-730d123ae768" (UID: "a68c0582-1bb2-4c8e-a81a-730d123ae768"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.488389 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512474 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="proxy-httpd" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512512 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="proxy-httpd" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512537 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="sg-core" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512543 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="sg-core" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512561 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon-log" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512566 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon-log" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512574 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512581 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512595 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-api" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512602 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-api" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512613 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-notification-agent" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512619 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-notification-agent" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512636 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512642 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512678 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="probe" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512684 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="probe" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512704 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-httpd" Oct 06 
09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512710 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-httpd" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512732 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="cinder-scheduler" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512739 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="cinder-scheduler" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512748 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512754 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" Oct 06 09:02:03 crc kubenswrapper[4610]: E1006 09:02:03.512770 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-central-agent" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.512776 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-central-agent" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518369 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="sg-core" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518415 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="cinder-scheduler" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518445 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518469 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="96270269-2a63-42c6-a881-36aba28d88ae" containerName="neutron-httpd" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518479 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api-log" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518485 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" containerName="barbican-api" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518499 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" containerName="horizon-log" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518506 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" containerName="probe" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518527 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="proxy-httpd" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518544 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-central-agent" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518562 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="96270269-2a63-42c6-a881-36aba28d88ae" 
containerName="neutron-api" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.518575 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" containerName="ceilometer-notification-agent" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.526085 4610 scope.go:117] "RemoveContainer" containerID="caec19a4f32b7d63efc63516b9720214f2bd61b4e25caa1b10e7f9e209953778" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.538256 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.538368 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.542250 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.563748 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68c0582-1bb2-4c8e-a81a-730d123ae768-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.606577 4610 scope.go:117] "RemoveContainer" containerID="e54199d7ed64789b07edfaa8797f5c347ec8f27d3c16ad9fbbae7840a6239288" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.640526 4610 scope.go:117] "RemoveContainer" containerID="6f2c41817a62375df35be86b9797fb05a7fd706c6e01cb42f5ef271d8320a9e6" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.664877 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.664932 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kdwp\" (UniqueName: \"kubernetes.io/projected/e29afc72-dbf0-453c-b96b-42d0399d6286-kube-api-access-9kdwp\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.664982 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-config-data\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.665013 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.665062 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-scripts\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.665118 4610 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29afc72-dbf0-453c-b96b-42d0399d6286-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766215 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-config-data\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766289 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766325 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-scripts\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766382 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29afc72-dbf0-453c-b96b-42d0399d6286-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766476 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766516 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kdwp\" (UniqueName: \"kubernetes.io/projected/e29afc72-dbf0-453c-b96b-42d0399d6286-kube-api-access-9kdwp\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.766683 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29afc72-dbf0-453c-b96b-42d0399d6286-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.769855 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-scripts\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.770314 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-config-data\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " 
pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.770520 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.771603 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29afc72-dbf0-453c-b96b-42d0399d6286-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.785445 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kdwp\" (UniqueName: \"kubernetes.io/projected/e29afc72-dbf0-453c-b96b-42d0399d6286-kube-api-access-9kdwp\") pod \"cinder-scheduler-0\" (UID: \"e29afc72-dbf0-453c-b96b-42d0399d6286\") " pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.876366 4610 generic.go:334] "Generic (PLEG): container finished" podID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerID="58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab" exitCode=143 Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.876471 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a","Type":"ContainerDied","Data":"58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab"} Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.881408 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c65b98c55-xjdpw" event={"ID":"2d4cceaf-e744-49da-a634-84401f61d862","Type":"ContainerStarted","Data":"ba6523514fa6007c861489bb72259c78e48ab34fd5188435f86288b022d23fca"} Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.881603 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c65b98c55-xjdpw" event={"ID":"2d4cceaf-e744-49da-a634-84401f61d862","Type":"ContainerStarted","Data":"07061ea1638aed9eabc2c91e4b8ca2956c06b6b3ca2ce8f5a6917765a28da523"} Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.881706 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c65b98c55-xjdpw" event={"ID":"2d4cceaf-e744-49da-a634-84401f61d862","Type":"ContainerStarted","Data":"aa660d482819a09aaf70dc7c4f116df255fff0cbcad66520d26c051e1ae47919"} Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.881826 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.881914 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.885601 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a68c0582-1bb2-4c8e-a81a-730d123ae768","Type":"ContainerDied","Data":"1ece93078a18809eb15b470284d8e04f45d37460e5c9c632b5db05c053ed3a1e"} Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.885770 4610 scope.go:117] "RemoveContainer" containerID="d0cff3adc5fbd03695f5bd42ce8e4739fcdd77785b619e0ef57573894281ee26" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.885676 4610 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.893781 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 09:02:03 crc kubenswrapper[4610]: I1006 09:02:03.912572 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-c65b98c55-xjdpw" podStartSLOduration=7.912550843 podStartE2EDuration="7.912550843s" podCreationTimestamp="2025-10-06 09:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:03.911594889 +0000 UTC m=+1255.626648277" watchObservedRunningTime="2025-10-06 09:02:03.912550843 +0000 UTC m=+1255.627604241" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.042816 4610 scope.go:117] "RemoveContainer" containerID="b90c2921ed9406e9fa55ee0930e4263fe37219332d1c6a739991283ec66ee627" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.048338 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.061228 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.077141 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.084771 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.090288 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.092398 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.092600 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.093165 4610 scope.go:117] "RemoveContainer" containerID="f717e5f26cf60271be30adf7d9f51a9a861ccf4f3f830ad4bd0246a9ed421fe6" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.141820 4610 scope.go:117] "RemoveContainer" containerID="ad8ec97a6d2bf074ed199d3e6854a04afeb4a19c2fa8cc10c8a52d445dd3e700" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.178708 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-log-httpd\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.178780 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.178857 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.178934 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-scripts\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.178990 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-config-data\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.179034 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-run-httpd\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.179145 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8st\" (UniqueName: \"kubernetes.io/projected/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-kube-api-access-jn8st\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280324 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-log-httpd\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280370 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280418 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280456 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-scripts\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280493 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-config-data\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280529 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-run-httpd\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.280564 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn8st\" (UniqueName: \"kubernetes.io/projected/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-kube-api-access-jn8st\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.281083 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-run-httpd\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.281455 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-log-httpd\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.285797 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.286698 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-config-data\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.286706 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.298639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn8st\" (UniqueName: \"kubernetes.io/projected/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-kube-api-access-jn8st\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.299474 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-scripts\") pod \"ceilometer-0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.429878 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.447442 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.901088 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29afc72-dbf0-453c-b96b-42d0399d6286","Type":"ContainerStarted","Data":"1f7fbc5bc9117eabb9250b3b9c8072a280bd6015d0f46449822d7814ea93e8ff"} Oct 06 09:02:04 crc kubenswrapper[4610]: I1006 09:02:04.940688 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:04 crc kubenswrapper[4610]: W1006 09:02:04.952562 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb31ca9a3_5a7d_4908_b3c6_dbfccb29dcb0.slice/crio-e349406ee216442af3f54586376ab64f540b1d90f23404b11319fc2dd8223ca7 WatchSource:0}: Error finding container e349406ee216442af3f54586376ab64f540b1d90f23404b11319fc2dd8223ca7: Status 404 returned error can't find the container with id e349406ee216442af3f54586376ab64f540b1d90f23404b11319fc2dd8223ca7 Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.085205 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478966be-8437-4700-9ee1-6692e4ef7a1e" path="/var/lib/kubelet/pods/478966be-8437-4700-9ee1-6692e4ef7a1e/volumes" Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.085902 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96270269-2a63-42c6-a881-36aba28d88ae" path="/var/lib/kubelet/pods/96270269-2a63-42c6-a881-36aba28d88ae/volumes" Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.086557 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68c0582-1bb2-4c8e-a81a-730d123ae768" path="/var/lib/kubelet/pods/a68c0582-1bb2-4c8e-a81a-730d123ae768/volumes" Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.087891 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c8eb3d-3866-4f23-8ebc-0357571f26a6" path="/var/lib/kubelet/pods/e9c8eb3d-3866-4f23-8ebc-0357571f26a6/volumes" Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.088576 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff84256d-299b-45c7-8b7a-454d64cd0244" path="/var/lib/kubelet/pods/ff84256d-299b-45c7-8b7a-454d64cd0244/volumes" Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.922796 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerStarted","Data":"72862aadddbcba84fffd890d2a1f23d32c7e6435d0fc00c55cd6abe31052a470"} Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.923370 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerStarted","Data":"e349406ee216442af3f54586376ab64f540b1d90f23404b11319fc2dd8223ca7"} Oct 06 09:02:05 crc kubenswrapper[4610]: I1006 09:02:05.925860 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29afc72-dbf0-453c-b96b-42d0399d6286","Type":"ContainerStarted","Data":"e45358cee368dcdda894d5b9d56d408c969418b18af1c2dc21d21c15155e5658"} Oct 06 09:02:06 crc kubenswrapper[4610]: I1006 09:02:06.935954 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerStarted","Data":"9100fd10ef4b769f59565eeb109d0ff1873a51366c6faed72592b4f97436e9c3"} Oct 06 09:02:06 crc kubenswrapper[4610]: I1006 09:02:06.940975 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29afc72-dbf0-453c-b96b-42d0399d6286","Type":"ContainerStarted","Data":"491548addcb6bb6b7df60428ba82555c0c1ebdff095943410b7ad01e1a4039cb"} Oct 06 09:02:06 crc kubenswrapper[4610]: I1006 09:02:06.961961 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.961937153 podStartE2EDuration="3.961937153s" podCreationTimestamp="2025-10-06 09:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:06.961344208 +0000 UTC m=+1258.676397596" watchObservedRunningTime="2025-10-06 09:02:06.961937153 +0000 UTC m=+1258.676990561" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.561947 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676373 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-config-data\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676470 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-scripts\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676496 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-httpd-run\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676530 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ndbv\" (UniqueName: \"kubernetes.io/projected/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-kube-api-access-8ndbv\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676556 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676605 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-logs\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676698 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-internal-tls-certs\") pod 
\"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.676729 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-combined-ca-bundle\") pod \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\" (UID: \"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a\") " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.679171 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-logs" (OuterVolumeSpecName: "logs") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.686544 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.692904 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-kube-api-access-8ndbv" (OuterVolumeSpecName: "kube-api-access-8ndbv") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "kube-api-access-8ndbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.694130 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-scripts" (OuterVolumeSpecName: "scripts") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.701298 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.768181 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.782324 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.782347 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.782355 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.782363 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ndbv\" (UniqueName: \"kubernetes.io/projected/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-kube-api-access-8ndbv\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.782385 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.799836 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.802175 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.821446 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.828172 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-config-data" (OuterVolumeSpecName: "config-data") pod "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" (UID: "cbf0caa2-fa42-437c-8a1a-c691b35f5d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.903573 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.903610 4610 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.903622 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.968602 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerStarted","Data":"234da430ca6e89c725bafcccb643a5b0808325202b9c20beb9736e7c2db2b93e"} Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.976873 4610 generic.go:334] "Generic (PLEG): container finished" podID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerID="a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a" exitCode=0 Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.976993 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.977754 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a","Type":"ContainerDied","Data":"a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a"} Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.977780 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cbf0caa2-fa42-437c-8a1a-c691b35f5d3a","Type":"ContainerDied","Data":"0734404a66405d1de3b67fe32832da53c8050c4a463aff4112ee0756462fd513"} Oct 06 09:02:07 crc kubenswrapper[4610]: I1006 09:02:07.977797 4610 scope.go:117] "RemoveContainer" containerID="a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.013742 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.033216 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.035307 4610 scope.go:117] "RemoveContainer" containerID="58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.047847 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:02:08 crc kubenswrapper[4610]: E1006 09:02:08.048333 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-httpd" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.048360 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-httpd" Oct 06 09:02:08 crc kubenswrapper[4610]: E1006 09:02:08.048406 4610 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-log" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.048416 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-log" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.049187 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-log" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.049243 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" containerName="glance-httpd" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.050600 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.054385 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.055082 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.067935 4610 scope.go:117] "RemoveContainer" containerID="a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a" Oct 06 09:02:08 crc kubenswrapper[4610]: E1006 09:02:08.070488 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a\": container with ID starting with a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a not found: ID does not exist" containerID="a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.070521 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a"} err="failed to get container status \"a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a\": rpc error: code = NotFound desc = could not find container \"a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a\": container with ID starting with a9bdadbd27f652abb5b212d3f1b602d118bf38a8901d1d56d3dac83da98eaf9a not found: ID does not exist" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.070542 4610 scope.go:117] "RemoveContainer" containerID="58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab" Oct 06 09:02:08 crc kubenswrapper[4610]: E1006 09:02:08.071943 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab\": container with ID starting with 58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab not found: ID does not exist" containerID="58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.071972 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab"} err="failed to get container status \"58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab\": rpc error: code = NotFound desc = could not find container 
\"58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab\": container with ID starting with 58323e3f0417e988a74483cb8594fcc1079b33fb794848b17762e508480e66ab not found: ID does not exist" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.072209 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109234 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109297 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109347 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba3a861e-9618-4947-9e23-c285ec4d43a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109398 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkq7b\" (UniqueName: \"kubernetes.io/projected/ba3a861e-9618-4947-9e23-c285ec4d43a6-kube-api-access-mkq7b\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109544 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109634 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109661 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.109680 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba3a861e-9618-4947-9e23-c285ec4d43a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212009 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212087 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212107 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212124 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba3a861e-9618-4947-9e23-c285ec4d43a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212151 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212181 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212205 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba3a861e-9618-4947-9e23-c285ec4d43a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.212237 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkq7b\" (UniqueName: \"kubernetes.io/projected/ba3a861e-9618-4947-9e23-c285ec4d43a6-kube-api-access-mkq7b\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.213106 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba3a861e-9618-4947-9e23-c285ec4d43a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.213462 
4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.219476 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba3a861e-9618-4947-9e23-c285ec4d43a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.219956 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.220110 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.223082 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.238129 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba3a861e-9618-4947-9e23-c285ec4d43a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.252735 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkq7b\" (UniqueName: \"kubernetes.io/projected/ba3a861e-9618-4947-9e23-c285ec4d43a6-kube-api-access-mkq7b\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.272981 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba3a861e-9618-4947-9e23-c285ec4d43a6\") " pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.371996 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.706440 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:08 crc kubenswrapper[4610]: I1006 09:02:08.894299 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.028256 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.091415 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf0caa2-fa42-437c-8a1a-c691b35f5d3a" path="/var/lib/kubelet/pods/cbf0caa2-fa42-437c-8a1a-c691b35f5d3a/volumes" Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.760129 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-dd97s"] Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.761958 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.782662 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dd97s"] Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.864856 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qrd6k"] Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.866977 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.880374 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qrd6k"] Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.901902 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcz4s\" (UniqueName: \"kubernetes.io/projected/1bf5de1e-fa0f-47d2-a549-35836fecffa8-kube-api-access-mcz4s\") pod \"nova-api-db-create-dd97s\" (UID: \"1bf5de1e-fa0f-47d2-a549-35836fecffa8\") " pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.957863 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-q6xqb"] Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.958991 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:09 crc kubenswrapper[4610]: I1006 09:02:09.983961 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-q6xqb"] Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.005408 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcz4s\" (UniqueName: \"kubernetes.io/projected/1bf5de1e-fa0f-47d2-a549-35836fecffa8-kube-api-access-mcz4s\") pod \"nova-api-db-create-dd97s\" (UID: \"1bf5de1e-fa0f-47d2-a549-35836fecffa8\") " pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.005539 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszbl\" (UniqueName: \"kubernetes.io/projected/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55-kube-api-access-zszbl\") pod \"nova-cell0-db-create-qrd6k\" (UID: \"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55\") " pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.018723 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba3a861e-9618-4947-9e23-c285ec4d43a6","Type":"ContainerStarted","Data":"2664eac0721215c2c1a2d9affc5f50f3e12aebbf86836242458ebc9f87c6d6d4"} Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.018767 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba3a861e-9618-4947-9e23-c285ec4d43a6","Type":"ContainerStarted","Data":"174939aec97c7f2db0d7f8f66264d0f5287f97a096f47c40e4a258386c4bafcd"} Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.039569 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerStarted","Data":"736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b"} Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.039767 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-central-agent" containerID="cri-o://72862aadddbcba84fffd890d2a1f23d32c7e6435d0fc00c55cd6abe31052a470" gracePeriod=30 Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.040102 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.040423 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="proxy-httpd" containerID="cri-o://736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b" gracePeriod=30 Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.040482 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="sg-core" containerID="cri-o://234da430ca6e89c725bafcccb643a5b0808325202b9c20beb9736e7c2db2b93e" gracePeriod=30 Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.040530 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-notification-agent" containerID="cri-o://9100fd10ef4b769f59565eeb109d0ff1873a51366c6faed72592b4f97436e9c3" gracePeriod=30 Oct 06 09:02:10 crc 
kubenswrapper[4610]: I1006 09:02:10.044475 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcz4s\" (UniqueName: \"kubernetes.io/projected/1bf5de1e-fa0f-47d2-a549-35836fecffa8-kube-api-access-mcz4s\") pod \"nova-api-db-create-dd97s\" (UID: \"1bf5de1e-fa0f-47d2-a549-35836fecffa8\") " pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.070685 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.177486562 podStartE2EDuration="6.070668284s" podCreationTimestamp="2025-10-06 09:02:04 +0000 UTC" firstStartedPulling="2025-10-06 09:02:04.954236607 +0000 UTC m=+1256.669289995" lastFinishedPulling="2025-10-06 09:02:08.847418329 +0000 UTC m=+1260.562471717" observedRunningTime="2025-10-06 09:02:10.06931312 +0000 UTC m=+1261.784366508" watchObservedRunningTime="2025-10-06 09:02:10.070668284 +0000 UTC m=+1261.785721672" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.095006 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.106910 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvtx\" (UniqueName: \"kubernetes.io/projected/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69-kube-api-access-4gvtx\") pod \"nova-cell1-db-create-q6xqb\" (UID: \"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69\") " pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.107163 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zszbl\" (UniqueName: \"kubernetes.io/projected/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55-kube-api-access-zszbl\") pod \"nova-cell0-db-create-qrd6k\" (UID: \"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55\") " pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.125770 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszbl\" (UniqueName: \"kubernetes.io/projected/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55-kube-api-access-zszbl\") pod \"nova-cell0-db-create-qrd6k\" (UID: \"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55\") " pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.190931 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.209776 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvtx\" (UniqueName: \"kubernetes.io/projected/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69-kube-api-access-4gvtx\") pod \"nova-cell1-db-create-q6xqb\" (UID: \"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69\") " pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.236991 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvtx\" (UniqueName: \"kubernetes.io/projected/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69-kube-api-access-4gvtx\") pod \"nova-cell1-db-create-q6xqb\" (UID: \"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69\") " pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.297302 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.658662 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dd97s"] Oct 06 09:02:10 crc kubenswrapper[4610]: I1006 09:02:10.910314 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-q6xqb"] Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.016198 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qrd6k"] Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.058816 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba3a861e-9618-4947-9e23-c285ec4d43a6","Type":"ContainerStarted","Data":"6842c847564c5c2fa50a0d7a52095f8a50f0017283e25b1fe38b957b8cfc1ee6"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.063841 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qrd6k" event={"ID":"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55","Type":"ContainerStarted","Data":"9b4cef67a54732bbbf696fb81d01b48783847821e8cf53d8309713012cbb569e"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.081749 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.081734378 podStartE2EDuration="3.081734378s" podCreationTimestamp="2025-10-06 09:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:11.079753078 +0000 UTC m=+1262.794806466" watchObservedRunningTime="2025-10-06 09:02:11.081734378 +0000 UTC m=+1262.796787756" Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.102578 4610 generic.go:334] "Generic (PLEG): container finished" podID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerID="736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b" exitCode=0 Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.102613 4610 generic.go:334] "Generic (PLEG): container finished" podID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerID="234da430ca6e89c725bafcccb643a5b0808325202b9c20beb9736e7c2db2b93e" exitCode=2 Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.102631 4610 generic.go:334] "Generic (PLEG): container finished" podID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerID="9100fd10ef4b769f59565eeb109d0ff1873a51366c6faed72592b4f97436e9c3" exitCode=0 Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.113431 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dd97s" event={"ID":"1bf5de1e-fa0f-47d2-a549-35836fecffa8","Type":"ContainerStarted","Data":"64f5faa784e65946415f338abd4b1397d4da9cf7dad60ca888d63dad3f4fc172"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.113480 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerDied","Data":"736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.113502 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerDied","Data":"234da430ca6e89c725bafcccb643a5b0808325202b9c20beb9736e7c2db2b93e"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.113512 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerDied","Data":"9100fd10ef4b769f59565eeb109d0ff1873a51366c6faed72592b4f97436e9c3"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.113521 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q6xqb" event={"ID":"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69","Type":"ContainerStarted","Data":"02637366bd989096a8c2e3abbdfb6ae13be5f6f95e7033da79ed4eebde1d4c71"} Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.399590 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:02:11 crc kubenswrapper[4610]: I1006 09:02:11.407589 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c65b98c55-xjdpw" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.118730 4610 generic.go:334] "Generic (PLEG): container finished" podID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerID="72862aadddbcba84fffd890d2a1f23d32c7e6435d0fc00c55cd6abe31052a470" exitCode=0 Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.118818 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerDied","Data":"72862aadddbcba84fffd890d2a1f23d32c7e6435d0fc00c55cd6abe31052a470"} Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.121924 4610 generic.go:334] "Generic (PLEG): container finished" podID="9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69" containerID="23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e" exitCode=0 Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.121988 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q6xqb" event={"ID":"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69","Type":"ContainerDied","Data":"23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e"} Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.124967 4610 generic.go:334] "Generic (PLEG): container finished" podID="1bf5de1e-fa0f-47d2-a549-35836fecffa8" containerID="5c39e5162ee2d6e80f8972919d2e53019c1bed87cb41ad4215c3375c93e54f48" exitCode=0 Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.125025 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dd97s" event={"ID":"1bf5de1e-fa0f-47d2-a549-35836fecffa8","Type":"ContainerDied","Data":"5c39e5162ee2d6e80f8972919d2e53019c1bed87cb41ad4215c3375c93e54f48"} Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.127298 4610 generic.go:334] "Generic (PLEG): container finished" podID="bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55" containerID="275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971" exitCode=0 Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.127944 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qrd6k" event={"ID":"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55","Type":"ContainerDied","Data":"275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971"} Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.250407 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348075 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn8st\" (UniqueName: \"kubernetes.io/projected/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-kube-api-access-jn8st\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348193 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-run-httpd\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348233 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-log-httpd\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348254 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-config-data\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348286 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-combined-ca-bundle\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348311 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-scripts\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.348342 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-sg-core-conf-yaml\") pod \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\" (UID: \"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0\") " Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.349009 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.349293 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.350498 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.353805 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-scripts" (OuterVolumeSpecName: "scripts") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.357166 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-kube-api-access-jn8st" (OuterVolumeSpecName: "kube-api-access-jn8st") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). InnerVolumeSpecName "kube-api-access-jn8st". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.385275 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.450957 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.450992 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.451003 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.451015 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn8st\" (UniqueName: \"kubernetes.io/projected/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-kube-api-access-jn8st\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.453802 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.487600 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-config-data" (OuterVolumeSpecName: "config-data") pod "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" (UID: "b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.552630 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:12 crc kubenswrapper[4610]: I1006 09:02:12.552664 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.136651 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0","Type":"ContainerDied","Data":"e349406ee216442af3f54586376ab64f540b1d90f23404b11319fc2dd8223ca7"} Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.136715 4610 scope.go:117] "RemoveContainer" containerID="736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.137557 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.163395 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.181423 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.191390 4610 scope.go:117] "RemoveContainer" containerID="234da430ca6e89c725bafcccb643a5b0808325202b9c20beb9736e7c2db2b93e" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.202726 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:13 crc kubenswrapper[4610]: E1006 09:02:13.203121 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="proxy-httpd" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203135 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="proxy-httpd" Oct 06 09:02:13 crc kubenswrapper[4610]: E1006 09:02:13.203156 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-central-agent" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203162 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-central-agent" Oct 06 09:02:13 crc kubenswrapper[4610]: E1006 09:02:13.203180 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="sg-core" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203186 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="sg-core" Oct 06 09:02:13 crc kubenswrapper[4610]: E1006 09:02:13.203205 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-notification-agent" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203211 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-notification-agent" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203402 4610 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-notification-agent" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203418 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="proxy-httpd" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203433 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="sg-core" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.203448 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" containerName="ceilometer-central-agent" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.205346 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.208558 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.208754 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.210469 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.252405 4610 scope.go:117] "RemoveContainer" containerID="9100fd10ef4b769f59565eeb109d0ff1873a51366c6faed72592b4f97436e9c3" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280216 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-run-httpd\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280276 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r9w7\" (UniqueName: \"kubernetes.io/projected/f0506c59-024d-4c05-aae8-84e6c01075a0-kube-api-access-5r9w7\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280303 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-scripts\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280333 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-log-httpd\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280373 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280396 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-config-data\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.280564 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.295369 4610 scope.go:117] "RemoveContainer" containerID="72862aadddbcba84fffd890d2a1f23d32c7e6435d0fc00c55cd6abe31052a470" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.381890 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-config-data\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.381943 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.382010 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-run-httpd\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.382058 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r9w7\" (UniqueName: \"kubernetes.io/projected/f0506c59-024d-4c05-aae8-84e6c01075a0-kube-api-access-5r9w7\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.382088 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-scripts\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.382112 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-log-httpd\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.382150 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.384543 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-run-httpd\") pod 
\"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.385532 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-log-httpd\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.386505 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-scripts\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.402450 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-config-data\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.402674 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.402792 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.415167 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r9w7\" (UniqueName: \"kubernetes.io/projected/f0506c59-024d-4c05-aae8-84e6c01075a0-kube-api-access-5r9w7\") pod \"ceilometer-0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.534397 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:13 crc kubenswrapper[4610]: W1006 09:02:13.537232 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb31ca9a3_5a7d_4908_b3c6_dbfccb29dcb0.slice/crio-736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b.scope WatchSource:0}: Error finding container 736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b: Status 404 returned error can't find the container with id 736f7dc034fee7c74c7901b81d05b43b90bd7ee2a4f6cc248ea22f3fb25ec95b Oct 06 09:02:13 crc kubenswrapper[4610]: W1006 09:02:13.537875 4610 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd27a1b_1f42_4cb0_9d18_8fb1405e6a69.slice/crio-conmon-23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd27a1b_1f42_4cb0_9d18_8fb1405e6a69.slice/crio-conmon-23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e.scope: no such file or directory Oct 06 09:02:13 crc kubenswrapper[4610]: W1006 09:02:13.537931 4610 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf183c7_2aaa_4ac6_9aae_ce3f12f19e55.slice/crio-conmon-275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf183c7_2aaa_4ac6_9aae_ce3f12f19e55.slice/crio-conmon-275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971.scope: no such file or directory Oct 06 09:02:13 crc kubenswrapper[4610]: W1006 09:02:13.537948 4610 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd27a1b_1f42_4cb0_9d18_8fb1405e6a69.slice/crio-23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd27a1b_1f42_4cb0_9d18_8fb1405e6a69.slice/crio-23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e.scope: no such file or directory Oct 06 09:02:13 crc kubenswrapper[4610]: W1006 09:02:13.537963 4610 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf183c7_2aaa_4ac6_9aae_ce3f12f19e55.slice/crio-275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf183c7_2aaa_4ac6_9aae_ce3f12f19e55.slice/crio-275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971.scope: no such file or directory Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.665590 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.672857 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.721099 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.791333 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcz4s\" (UniqueName: \"kubernetes.io/projected/1bf5de1e-fa0f-47d2-a549-35836fecffa8-kube-api-access-mcz4s\") pod \"1bf5de1e-fa0f-47d2-a549-35836fecffa8\" (UID: \"1bf5de1e-fa0f-47d2-a549-35836fecffa8\") " Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.791465 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvtx\" (UniqueName: \"kubernetes.io/projected/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69-kube-api-access-4gvtx\") pod \"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69\" (UID: \"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69\") " Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.791552 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zszbl\" (UniqueName: \"kubernetes.io/projected/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55-kube-api-access-zszbl\") pod \"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55\" (UID: \"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55\") " Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.798287 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55-kube-api-access-zszbl" (OuterVolumeSpecName: "kube-api-access-zszbl") pod "bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55" (UID: "bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55"). InnerVolumeSpecName "kube-api-access-zszbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.798358 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf5de1e-fa0f-47d2-a549-35836fecffa8-kube-api-access-mcz4s" (OuterVolumeSpecName: "kube-api-access-mcz4s") pod "1bf5de1e-fa0f-47d2-a549-35836fecffa8" (UID: "1bf5de1e-fa0f-47d2-a549-35836fecffa8"). InnerVolumeSpecName "kube-api-access-mcz4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.855235 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69-kube-api-access-4gvtx" (OuterVolumeSpecName: "kube-api-access-4gvtx") pod "9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69" (UID: "9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69"). InnerVolumeSpecName "kube-api-access-4gvtx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.893274 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcz4s\" (UniqueName: \"kubernetes.io/projected/1bf5de1e-fa0f-47d2-a549-35836fecffa8-kube-api-access-mcz4s\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.893303 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvtx\" (UniqueName: \"kubernetes.io/projected/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69-kube-api-access-4gvtx\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.893313 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zszbl\" (UniqueName: \"kubernetes.io/projected/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55-kube-api-access-zszbl\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:13 crc kubenswrapper[4610]: I1006 09:02:13.997751 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.055981 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.087947 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.088256 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-log" containerID="cri-o://0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4" gracePeriod=30 Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.088445 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-httpd" containerID="cri-o://babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e" gracePeriod=30 Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.170275 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dd97s" event={"ID":"1bf5de1e-fa0f-47d2-a549-35836fecffa8","Type":"ContainerDied","Data":"64f5faa784e65946415f338abd4b1397d4da9cf7dad60ca888d63dad3f4fc172"} Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.170474 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64f5faa784e65946415f338abd4b1397d4da9cf7dad60ca888d63dad3f4fc172" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.170533 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-dd97s" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.191158 4610 generic.go:334] "Generic (PLEG): container finished" podID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerID="2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7" exitCode=137 Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.191374 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5490ccd6-2c49-47b6-a3e0-9d068b0080b4","Type":"ContainerDied","Data":"2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7"} Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.191450 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5490ccd6-2c49-47b6-a3e0-9d068b0080b4","Type":"ContainerDied","Data":"f58ec056b7721a38ec1a755b4261df5a78540fb1cc7c5d980093503ba26097fe"} Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.191510 4610 scope.go:117] "RemoveContainer" containerID="2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.191603 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.206937 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qrd6k" event={"ID":"bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55","Type":"ContainerDied","Data":"9b4cef67a54732bbbf696fb81d01b48783847821e8cf53d8309713012cbb569e"} Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.207098 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b4cef67a54732bbbf696fb81d01b48783847821e8cf53d8309713012cbb569e" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.207209 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qrd6k" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217151 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-etc-machine-id\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217242 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data-custom\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217281 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-logs\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217340 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-combined-ca-bundle\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217412 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217458 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-scripts\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.217515 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf5lr\" (UniqueName: \"kubernetes.io/projected/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-kube-api-access-nf5lr\") pod \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\" (UID: \"5490ccd6-2c49-47b6-a3e0-9d068b0080b4\") " Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.219500 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.219729 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-logs" (OuterVolumeSpecName: "logs") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.246962 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-scripts" (OuterVolumeSpecName: "scripts") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.252387 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q6xqb" event={"ID":"9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69","Type":"ContainerDied","Data":"02637366bd989096a8c2e3abbdfb6ae13be5f6f95e7033da79ed4eebde1d4c71"} Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.252442 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02637366bd989096a8c2e3abbdfb6ae13be5f6f95e7033da79ed4eebde1d4c71" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.252510 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q6xqb" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.262823 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: W1006 09:02:14.267205 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0506c59_024d_4c05_aae8_84e6c01075a0.slice/crio-19156b1357ea6807826d254c3cb4cf0d455023e849c16657b30e5d9d908cf829 WatchSource:0}: Error finding container 19156b1357ea6807826d254c3cb4cf0d455023e849c16657b30e5d9d908cf829: Status 404 returned error can't find the container with id 19156b1357ea6807826d254c3cb4cf0d455023e849c16657b30e5d9d908cf829 Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.308266 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.308393 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-kube-api-access-nf5lr" (OuterVolumeSpecName: "kube-api-access-nf5lr") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "kube-api-access-nf5lr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.319417 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.319447 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf5lr\" (UniqueName: \"kubernetes.io/projected/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-kube-api-access-nf5lr\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.319458 4610 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.319467 4610 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.319478 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.340219 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.340423 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.374873 4610 scope.go:117] "RemoveContainer" containerID="57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.416949 4610 scope.go:117] "RemoveContainer" containerID="2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7" Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.417385 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7\": container with ID starting with 2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7 not found: ID does not exist" containerID="2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.417412 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7"} err="failed to get container status \"2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7\": rpc error: code = NotFound desc = could not find container \"2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7\": container with ID starting with 2ee3588f9ac05dc6935eaa722c4c226e7257c05bc956107c5ac7a87013da8fc7 not found: ID does not exist" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.417430 4610 scope.go:117] "RemoveContainer" 
containerID="57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5" Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.423826 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5\": container with ID starting with 57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5 not found: ID does not exist" containerID="57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.423871 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5"} err="failed to get container status \"57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5\": rpc error: code = NotFound desc = could not find container \"57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5\": container with ID starting with 57724d446f70173d10acff333cbce974a884ac8a795bbc74df29576e7593eef5 not found: ID does not exist" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.424989 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.441193 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data" (OuterVolumeSpecName: "config-data") pod "5490ccd6-2c49-47b6-a3e0-9d068b0080b4" (UID: "5490ccd6-2c49-47b6-a3e0-9d068b0080b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.527482 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5490ccd6-2c49-47b6-a3e0-9d068b0080b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.530136 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.538734 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550359 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.550711 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550725 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api" Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.550749 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api-log" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550756 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api-log" Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.550766 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550772 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.550789 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550795 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: E1006 09:02:14.550814 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf5de1e-fa0f-47d2-a549-35836fecffa8" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550821 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf5de1e-fa0f-47d2-a549-35836fecffa8" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.550988 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.551002 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" containerName="cinder-api-log" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.551010 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.551020 4610 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.551032 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf5de1e-fa0f-47d2-a549-35836fecffa8" containerName="mariadb-database-create" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.552333 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.555731 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.555795 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.558892 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.562706 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629239 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629580 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkrww\" (UniqueName: \"kubernetes.io/projected/279bb64b-8fba-4afc-9ded-6bd2375521ba-kube-api-access-wkrww\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629708 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-scripts\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629754 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/279bb64b-8fba-4afc-9ded-6bd2375521ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629790 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629817 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.629955 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/279bb64b-8fba-4afc-9ded-6bd2375521ba-logs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.630030 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-config-data\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.630074 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731527 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-scripts\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731586 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/279bb64b-8fba-4afc-9ded-6bd2375521ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731628 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731657 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731680 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/279bb64b-8fba-4afc-9ded-6bd2375521ba-logs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731715 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-config-data\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731738 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731798 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.731832 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkrww\" (UniqueName: \"kubernetes.io/projected/279bb64b-8fba-4afc-9ded-6bd2375521ba-kube-api-access-wkrww\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.732323 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/279bb64b-8fba-4afc-9ded-6bd2375521ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.737217 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.737473 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/279bb64b-8fba-4afc-9ded-6bd2375521ba-logs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.738613 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-config-data\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.738968 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-scripts\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.739725 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.744705 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.748726 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279bb64b-8fba-4afc-9ded-6bd2375521ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.757664 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkrww\" (UniqueName: 
\"kubernetes.io/projected/279bb64b-8fba-4afc-9ded-6bd2375521ba-kube-api-access-wkrww\") pod \"cinder-api-0\" (UID: \"279bb64b-8fba-4afc-9ded-6bd2375521ba\") " pod="openstack/cinder-api-0" Oct 06 09:02:14 crc kubenswrapper[4610]: I1006 09:02:14.872191 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.081108 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5490ccd6-2c49-47b6-a3e0-9d068b0080b4" path="/var/lib/kubelet/pods/5490ccd6-2c49-47b6-a3e0-9d068b0080b4/volumes" Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.082078 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0" path="/var/lib/kubelet/pods/b31ca9a3-5a7d-4908-b3c6-dbfccb29dcb0/volumes" Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.266970 4610 generic.go:334] "Generic (PLEG): container finished" podID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerID="0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4" exitCode=143 Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.267028 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a","Type":"ContainerDied","Data":"0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4"} Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.268219 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerStarted","Data":"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09"} Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.268243 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerStarted","Data":"19156b1357ea6807826d254c3cb4cf0d455023e849c16657b30e5d9d908cf829"} Oct 06 09:02:15 crc kubenswrapper[4610]: W1006 09:02:15.399690 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279bb64b_8fba_4afc_9ded_6bd2375521ba.slice/crio-05ef2a7ae2c5814201dd44091f255ccb0222f41c695f100c9d945f60e35b6378 WatchSource:0}: Error finding container 05ef2a7ae2c5814201dd44091f255ccb0222f41c695f100c9d945f60e35b6378: Status 404 returned error can't find the container with id 05ef2a7ae2c5814201dd44091f255ccb0222f41c695f100c9d945f60e35b6378 Oct 06 09:02:15 crc kubenswrapper[4610]: I1006 09:02:15.404133 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 09:02:16 crc kubenswrapper[4610]: I1006 09:02:16.289418 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"279bb64b-8fba-4afc-9ded-6bd2375521ba","Type":"ContainerStarted","Data":"bfdb75604b1421031a7c42d659c928fcf8e630096133e3604914f9d9facbc430"} Oct 06 09:02:16 crc kubenswrapper[4610]: I1006 09:02:16.289912 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"279bb64b-8fba-4afc-9ded-6bd2375521ba","Type":"ContainerStarted","Data":"05ef2a7ae2c5814201dd44091f255ccb0222f41c695f100c9d945f60e35b6378"} Oct 06 09:02:16 crc kubenswrapper[4610]: I1006 09:02:16.291514 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerStarted","Data":"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47"} Oct 06 09:02:17 crc kubenswrapper[4610]: I1006 09:02:17.300730 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerStarted","Data":"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077"} Oct 06 09:02:17 crc kubenswrapper[4610]: I1006 09:02:17.303382 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"279bb64b-8fba-4afc-9ded-6bd2375521ba","Type":"ContainerStarted","Data":"1a10e202f206066a5eccbdb0c6a2fea566eaea71adeb1ed114915e5c7cd911af"} Oct 06 09:02:17 crc kubenswrapper[4610]: I1006 09:02:17.304398 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.287207 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.310086 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.310067559 podStartE2EDuration="4.310067559s" podCreationTimestamp="2025-10-06 09:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:17.327143052 +0000 UTC m=+1269.042196450" watchObservedRunningTime="2025-10-06 09:02:18.310067559 +0000 UTC m=+1270.025120947" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.315163 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerStarted","Data":"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d"} Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.315330 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-central-agent" containerID="cri-o://47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" gracePeriod=30 Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.315389 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.315456 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="proxy-httpd" containerID="cri-o://b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" gracePeriod=30 Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.315498 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="sg-core" containerID="cri-o://6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" gracePeriod=30 Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.315530 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-notification-agent" containerID="cri-o://32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" gracePeriod=30 Oct 06 09:02:18 crc 
kubenswrapper[4610]: I1006 09:02:18.323247 4610 generic.go:334] "Generic (PLEG): container finished" podID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerID="babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e" exitCode=0 Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.324125 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.324320 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a","Type":"ContainerDied","Data":"babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e"} Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.324352 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a","Type":"ContainerDied","Data":"148517bf2ae786a03b261dccf2f97a4bbf3907b3ee5d082ee9c11ef8806c6811"} Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.324371 4610 scope.go:117] "RemoveContainer" containerID="babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.343202 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.944986127 podStartE2EDuration="5.343185601s" podCreationTimestamp="2025-10-06 09:02:13 +0000 UTC" firstStartedPulling="2025-10-06 09:02:14.287880677 +0000 UTC m=+1266.002934065" lastFinishedPulling="2025-10-06 09:02:17.686080151 +0000 UTC m=+1269.401133539" observedRunningTime="2025-10-06 09:02:18.339370035 +0000 UTC m=+1270.054423423" watchObservedRunningTime="2025-10-06 09:02:18.343185601 +0000 UTC m=+1270.058238989" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.372690 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.372730 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.397038 4610 scope.go:117] "RemoveContainer" containerID="0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.399897 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-combined-ca-bundle\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.399956 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sn5v\" (UniqueName: \"kubernetes.io/projected/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-kube-api-access-9sn5v\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.399978 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-public-tls-certs\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.400009 4610 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-logs\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.400025 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-config-data\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.400115 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-scripts\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.400130 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.400188 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-httpd-run\") pod \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\" (UID: \"25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a\") " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.402117 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.405771 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-logs" (OuterVolumeSpecName: "logs") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.411182 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.416190 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-scripts" (OuterVolumeSpecName: "scripts") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.416431 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.461254 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-kube-api-access-9sn5v" (OuterVolumeSpecName: "kube-api-access-9sn5v") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "kube-api-access-9sn5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.467309 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.503995 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.504344 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.504423 4610 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.504475 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sn5v\" (UniqueName: \"kubernetes.io/projected/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-kube-api-access-9sn5v\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.504522 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.513368 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.525686 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-config-data" (OuterVolumeSpecName: "config-data") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.544490 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.553532 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" (UID: "25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.563426 4610 scope.go:117] "RemoveContainer" containerID="babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e" Oct 06 09:02:18 crc kubenswrapper[4610]: E1006 09:02:18.563850 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e\": container with ID starting with babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e not found: ID does not exist" containerID="babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.563891 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e"} err="failed to get container status \"babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e\": rpc error: code = NotFound desc = could not find container \"babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e\": container with ID starting with babc037adbd5d37a4e6d751eeae8a8b3a1d9280d7f44f0ac13474aa69b548e8e not found: ID does not exist" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.563918 4610 scope.go:117] "RemoveContainer" containerID="0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4" Oct 06 09:02:18 crc kubenswrapper[4610]: E1006 09:02:18.564767 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4\": container with ID starting with 0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4 not found: ID does not exist" containerID="0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.564844 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4"} err="failed to get container status \"0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4\": rpc error: code = NotFound desc = could not find container \"0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4\": container with ID starting with 0284172d742de38d91d3c5416591d0ad3d7850487ff216cfc0d506d0d78bcaa4 not found: ID does not exist" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.606174 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.606203 4610 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.606213 4610 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.606221 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.657543 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.678725 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.687613 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:02:18 crc kubenswrapper[4610]: E1006 09:02:18.688101 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-httpd" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.688132 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-httpd" Oct 06 09:02:18 crc kubenswrapper[4610]: E1006 09:02:18.688163 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-log" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.688174 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-log" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.688401 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-log" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.688435 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" containerName="glance-httpd" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.689578 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.691657 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.691852 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.703798 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817212 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817300 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817382 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817404 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/225e171a-3dd8-4d73-af22-fa01ef4a7359-logs\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817437 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/225e171a-3dd8-4d73-af22-fa01ef4a7359-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817459 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-config-data\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817500 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9jn\" (UniqueName: \"kubernetes.io/projected/225e171a-3dd8-4d73-af22-fa01ef4a7359-kube-api-access-nr9jn\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.817524 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-scripts\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919276 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919384 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919411 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/225e171a-3dd8-4d73-af22-fa01ef4a7359-logs\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919444 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/225e171a-3dd8-4d73-af22-fa01ef4a7359-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919470 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-config-data\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919523 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9jn\" (UniqueName: \"kubernetes.io/projected/225e171a-3dd8-4d73-af22-fa01ef4a7359-kube-api-access-nr9jn\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919579 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-scripts\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.919599 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.920698 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/225e171a-3dd8-4d73-af22-fa01ef4a7359-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.921010 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.921808 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/225e171a-3dd8-4d73-af22-fa01ef4a7359-logs\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.925178 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.925698 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.927204 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-scripts\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.928487 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/225e171a-3dd8-4d73-af22-fa01ef4a7359-config-data\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.972172 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9jn\" (UniqueName: \"kubernetes.io/projected/225e171a-3dd8-4d73-af22-fa01ef4a7359-kube-api-access-nr9jn\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:18 crc kubenswrapper[4610]: I1006 09:02:18.983010 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"225e171a-3dd8-4d73-af22-fa01ef4a7359\") " pod="openstack/glance-default-external-api-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.028377 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.084764 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a" path="/var/lib/kubelet/pods/25ae5dad-7e0d-4fb7-a733-5a2f5bb4338a/volumes" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.170738 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.226714 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-run-httpd\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.226776 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-log-httpd\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.226861 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-combined-ca-bundle\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.226935 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-sg-core-conf-yaml\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.226992 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r9w7\" (UniqueName: \"kubernetes.io/projected/f0506c59-024d-4c05-aae8-84e6c01075a0-kube-api-access-5r9w7\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.227011 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-scripts\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.227037 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-config-data\") pod \"f0506c59-024d-4c05-aae8-84e6c01075a0\" (UID: \"f0506c59-024d-4c05-aae8-84e6c01075a0\") " Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.229332 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.229596 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.234266 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0506c59-024d-4c05-aae8-84e6c01075a0-kube-api-access-5r9w7" (OuterVolumeSpecName: "kube-api-access-5r9w7") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "kube-api-access-5r9w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.274289 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-scripts" (OuterVolumeSpecName: "scripts") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.286458 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.353368 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.353425 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0506c59-024d-4c05-aae8-84e6c01075a0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.353438 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.353455 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r9w7\" (UniqueName: \"kubernetes.io/projected/f0506c59-024d-4c05-aae8-84e6c01075a0-kube-api-access-5r9w7\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.353475 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.359834 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.379627 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.379928 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerDied","Data":"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d"} Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382339 4610 scope.go:117] "RemoveContainer" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.380120 4610 generic.go:334] "Generic (PLEG): container finished" podID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" exitCode=0 Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382473 4610 generic.go:334] "Generic (PLEG): container finished" podID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" exitCode=2 Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382489 4610 generic.go:334] "Generic (PLEG): container finished" podID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" exitCode=0 Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382506 4610 generic.go:334] "Generic (PLEG): container finished" podID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" exitCode=0 Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382551 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerDied","Data":"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077"} Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382570 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerDied","Data":"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47"} Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382582 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerDied","Data":"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09"} Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.382594 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0506c59-024d-4c05-aae8-84e6c01075a0","Type":"ContainerDied","Data":"19156b1357ea6807826d254c3cb4cf0d455023e849c16657b30e5d9d908cf829"} Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.397923 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.399087 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.428799 4610 scope.go:117] "RemoveContainer" containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.455233 4610 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.463317 4610 scope.go:117] "RemoveContainer" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.472188 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-config-data" (OuterVolumeSpecName: "config-data") pod "f0506c59-024d-4c05-aae8-84e6c01075a0" (UID: "f0506c59-024d-4c05-aae8-84e6c01075a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.547241 4610 scope.go:117] "RemoveContainer" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.557097 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0506c59-024d-4c05-aae8-84e6c01075a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.577076 4610 scope.go:117] "RemoveContainer" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.577536 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": container with ID starting with b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d not found: ID does not exist" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.577575 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d"} err="failed to get container status \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": rpc error: code = NotFound desc = could not find container \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": container with ID starting with b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.577602 4610 scope.go:117] "RemoveContainer" containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.577772 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": container with ID starting with 6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077 not found: ID does not exist" containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.577797 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077"} err="failed to get container status \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": rpc error: code = NotFound desc = could not find container 
\"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": container with ID starting with 6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.577811 4610 scope.go:117] "RemoveContainer" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.578010 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": container with ID starting with 32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47 not found: ID does not exist" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578030 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47"} err="failed to get container status \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": rpc error: code = NotFound desc = could not find container \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": container with ID starting with 32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578056 4610 scope.go:117] "RemoveContainer" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.578268 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": container with ID starting with 47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09 not found: ID does not exist" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578289 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09"} err="failed to get container status \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": rpc error: code = NotFound desc = could not find container \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": container with ID starting with 47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578305 4610 scope.go:117] "RemoveContainer" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578565 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d"} err="failed to get container status \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": rpc error: code = NotFound desc = could not find container \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": container with ID starting with b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578594 4610 scope.go:117] "RemoveContainer" 
containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578850 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077"} err="failed to get container status \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": rpc error: code = NotFound desc = could not find container \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": container with ID starting with 6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.578883 4610 scope.go:117] "RemoveContainer" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579170 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47"} err="failed to get container status \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": rpc error: code = NotFound desc = could not find container \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": container with ID starting with 32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579190 4610 scope.go:117] "RemoveContainer" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579390 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09"} err="failed to get container status \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": rpc error: code = NotFound desc = could not find container \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": container with ID starting with 47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579409 4610 scope.go:117] "RemoveContainer" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579652 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d"} err="failed to get container status \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": rpc error: code = NotFound desc = could not find container \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": container with ID starting with b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579672 4610 scope.go:117] "RemoveContainer" containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579885 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077"} err="failed to get container status \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": rpc error: code = NotFound desc = could not find 
container \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": container with ID starting with 6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.579903 4610 scope.go:117] "RemoveContainer" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580086 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47"} err="failed to get container status \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": rpc error: code = NotFound desc = could not find container \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": container with ID starting with 32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580109 4610 scope.go:117] "RemoveContainer" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580331 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09"} err="failed to get container status \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": rpc error: code = NotFound desc = could not find container \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": container with ID starting with 47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580344 4610 scope.go:117] "RemoveContainer" containerID="b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580648 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d"} err="failed to get container status \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": rpc error: code = NotFound desc = could not find container \"b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d\": container with ID starting with b1955d87a014f184bd404e9dadaf90e8ab84107410f13a098c83b38e9f32c82d not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580663 4610 scope.go:117] "RemoveContainer" containerID="6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.580998 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077"} err="failed to get container status \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": rpc error: code = NotFound desc = could not find container \"6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077\": container with ID starting with 6b986be62f41d48fb5bc5e70d932ba90e43091f6ca75a249f1cde54ae610a077 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.581021 4610 scope.go:117] "RemoveContainer" containerID="32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.581375 4610 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47"} err="failed to get container status \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": rpc error: code = NotFound desc = could not find container \"32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47\": container with ID starting with 32f0a79e611b10d9c86840d0a9251473f69380826231d711dea583be31ad8a47 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.581398 4610 scope.go:117] "RemoveContainer" containerID="47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.582405 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09"} err="failed to get container status \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": rpc error: code = NotFound desc = could not find container \"47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09\": container with ID starting with 47b2cdaea5fa8f9a6544c753397fd803ff97ce000a5ec0077b25270792f11d09 not found: ID does not exist" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.621243 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.746560 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.767987 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838106 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.838548 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="proxy-httpd" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838569 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="proxy-httpd" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.838585 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-central-agent" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838593 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-central-agent" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.838615 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-notification-agent" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838622 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-notification-agent" Oct 06 09:02:19 crc kubenswrapper[4610]: E1006 09:02:19.838636 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="sg-core" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838643 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="sg-core" Oct 06 09:02:19 crc kubenswrapper[4610]: 
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838802 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="sg-core"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838819 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-central-agent"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838875 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="proxy-httpd"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.838887 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" containerName="ceilometer-notification-agent"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.843731 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.844776 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.847266 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.848205 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.937713 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8b9a-account-create-jt5xg"]
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.938763 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8b9a-account-create-jt5xg"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.942850 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.959859 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8b9a-account-create-jt5xg"]
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978295 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj4qr\" (UniqueName: \"kubernetes.io/projected/4a39c855-fc17-4cc4-af66-dc39f28fc009-kube-api-access-qj4qr\") pod \"nova-api-8b9a-account-create-jt5xg\" (UID: \"4a39c855-fc17-4cc4-af66-dc39f28fc009\") " pod="openstack/nova-api-8b9a-account-create-jt5xg"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978380 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-scripts\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0"
Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978443 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhw7\" (UniqueName: \"kubernetes.io/projected/03f8aefa-9c2f-432d-a804-8bb380330379-kube-api-access-tjhw7\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0"
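The "SyncLoop DELETE" -> "SyncLoop REMOVE" -> "SyncLoop ADD" run for openstack/ceilometer-0 above is the API-side signature of a pod being deleted and immediately recreated under the same name (the new instance gets a new UID, here 03f8aefa-...). A sketch, under the assumption the journal has been split one entry per line, that reconstructs the verb sequence per pod:

    import re

    # Matches the kubelet SyncLoop entries that carry an api-sourced verb.
    SYNCLOOP = re.compile(
        r'"SyncLoop (ADD|UPDATE|DELETE|REMOVE)" source="api" pods=\["([^"]+)"\]'
    )

    def syncloop_history(lines):
        """Map pod name -> ordered list of SyncLoop verbs seen in the journal."""
        history = {}
        for line in lines:
            m = SYNCLOOP.search(line)
            if m:
                history.setdefault(m.group(2), []).append(m.group(1))
        return history

    # A DELETE, REMOVE, ADD run for the same name marks delete-and-recreate,
    # e.g. openstack/ceilometer-0 in the entries above.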
\"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-log-httpd\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978486 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978508 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978538 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-run-httpd\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:19 crc kubenswrapper[4610]: I1006 09:02:19.978561 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-config-data\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079603 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079659 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-run-httpd\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079679 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-config-data\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079713 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj4qr\" (UniqueName: \"kubernetes.io/projected/4a39c855-fc17-4cc4-af66-dc39f28fc009-kube-api-access-qj4qr\") pod \"nova-api-8b9a-account-create-jt5xg\" (UID: \"4a39c855-fc17-4cc4-af66-dc39f28fc009\") " pod="openstack/nova-api-8b9a-account-create-jt5xg" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079773 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-scripts\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079853 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhw7\" (UniqueName: \"kubernetes.io/projected/03f8aefa-9c2f-432d-a804-8bb380330379-kube-api-access-tjhw7\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.079984 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-log-httpd\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.080007 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.080314 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-log-httpd\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.080343 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-run-httpd\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.087340 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-scripts\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.089223 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.089705 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.089776 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-config-data\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.098504 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhw7\" (UniqueName: \"kubernetes.io/projected/03f8aefa-9c2f-432d-a804-8bb380330379-kube-api-access-tjhw7\") pod \"ceilometer-0\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.105584 4610 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qj4qr\" (UniqueName: \"kubernetes.io/projected/4a39c855-fc17-4cc4-af66-dc39f28fc009-kube-api-access-qj4qr\") pod \"nova-api-8b9a-account-create-jt5xg\" (UID: \"4a39c855-fc17-4cc4-af66-dc39f28fc009\") " pod="openstack/nova-api-8b9a-account-create-jt5xg" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.134634 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9537-account-create-4p6r5"] Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.135691 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.137709 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.164871 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9537-account-create-4p6r5"] Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.184369 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.185982 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvxc\" (UniqueName: \"kubernetes.io/projected/b77db1d8-8747-4703-9eb7-80037220ecde-kube-api-access-7nvxc\") pod \"nova-cell0-9537-account-create-4p6r5\" (UID: \"b77db1d8-8747-4703-9eb7-80037220ecde\") " pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.264448 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8b9a-account-create-jt5xg" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.290748 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvxc\" (UniqueName: \"kubernetes.io/projected/b77db1d8-8747-4703-9eb7-80037220ecde-kube-api-access-7nvxc\") pod \"nova-cell0-9537-account-create-4p6r5\" (UID: \"b77db1d8-8747-4703-9eb7-80037220ecde\") " pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.316667 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvxc\" (UniqueName: \"kubernetes.io/projected/b77db1d8-8747-4703-9eb7-80037220ecde-kube-api-access-7nvxc\") pod \"nova-cell0-9537-account-create-4p6r5\" (UID: \"b77db1d8-8747-4703-9eb7-80037220ecde\") " pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.378650 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-844b-account-create-n8hvs"] Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.380385 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.390033 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.417124 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"225e171a-3dd8-4d73-af22-fa01ef4a7359","Type":"ContainerStarted","Data":"6a41be3e2e0b2f7d3d0c41d492a60a9870851d41bccef07c5b751e7fe665f743"} Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.417414 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-844b-account-create-n8hvs"] Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.465320 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.495174 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htfj\" (UniqueName: \"kubernetes.io/projected/37db7d0c-8148-47dd-b730-b471fd07f6be-kube-api-access-9htfj\") pod \"nova-cell1-844b-account-create-n8hvs\" (UID: \"37db7d0c-8148-47dd-b730-b471fd07f6be\") " pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.599349 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htfj\" (UniqueName: \"kubernetes.io/projected/37db7d0c-8148-47dd-b730-b471fd07f6be-kube-api-access-9htfj\") pod \"nova-cell1-844b-account-create-n8hvs\" (UID: \"37db7d0c-8148-47dd-b730-b471fd07f6be\") " pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.622257 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htfj\" (UniqueName: \"kubernetes.io/projected/37db7d0c-8148-47dd-b730-b471fd07f6be-kube-api-access-9htfj\") pod \"nova-cell1-844b-account-create-n8hvs\" (UID: \"37db7d0c-8148-47dd-b730-b471fd07f6be\") " pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.766444 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.862239 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:20 crc kubenswrapper[4610]: W1006 09:02:20.867295 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f8aefa_9c2f_432d_a804_8bb380330379.slice/crio-ff073f2d4c081fada2883b37f35160e914145ac4ce636d506e8b5a06696d8058 WatchSource:0}: Error finding container ff073f2d4c081fada2883b37f35160e914145ac4ce636d506e8b5a06696d8058: Status 404 returned error can't find the container with id ff073f2d4c081fada2883b37f35160e914145ac4ce636d506e8b5a06696d8058 Oct 06 09:02:20 crc kubenswrapper[4610]: I1006 09:02:20.960132 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8b9a-account-create-jt5xg"] Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.116716 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0506c59-024d-4c05-aae8-84e6c01075a0" path="/var/lib/kubelet/pods/f0506c59-024d-4c05-aae8-84e6c01075a0/volumes" Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.117795 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9537-account-create-4p6r5"] Oct 06 09:02:21 crc kubenswrapper[4610]: W1006 09:02:21.165011 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb77db1d8_8747_4703_9eb7_80037220ecde.slice/crio-973229f393fc6b434e3184d5bf00e89895d08d5237d4e45c71d3c9c98ceb2753 WatchSource:0}: Error finding container 973229f393fc6b434e3184d5bf00e89895d08d5237d4e45c71d3c9c98ceb2753: Status 404 returned error can't find the container with id 973229f393fc6b434e3184d5bf00e89895d08d5237d4e45c71d3c9c98ceb2753 Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.343575 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-844b-account-create-n8hvs"] Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.445555 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-844b-account-create-n8hvs" event={"ID":"37db7d0c-8148-47dd-b730-b471fd07f6be","Type":"ContainerStarted","Data":"fe07f157f313b86b66f86bf2cc670bd19988395131cb036537ad3f8d1a67fc4a"} Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.449030 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9537-account-create-4p6r5" event={"ID":"b77db1d8-8747-4703-9eb7-80037220ecde","Type":"ContainerStarted","Data":"973229f393fc6b434e3184d5bf00e89895d08d5237d4e45c71d3c9c98ceb2753"} Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.452228 4610 generic.go:334] "Generic (PLEG): container finished" podID="4a39c855-fc17-4cc4-af66-dc39f28fc009" containerID="3f323c5f23ca573f4bf0fe819e8c7276efae1d8ad46ba6ec6fecde023a07678c" exitCode=0 Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.452524 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8b9a-account-create-jt5xg" event={"ID":"4a39c855-fc17-4cc4-af66-dc39f28fc009","Type":"ContainerDied","Data":"3f323c5f23ca573f4bf0fe819e8c7276efae1d8ad46ba6ec6fecde023a07678c"} Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.452889 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8b9a-account-create-jt5xg" 
event={"ID":"4a39c855-fc17-4cc4-af66-dc39f28fc009","Type":"ContainerStarted","Data":"e5470e861c0c9aa07aa204c93d2ba94dc7478dca0a86ef24a55bf722114ece7f"} Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.456164 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"225e171a-3dd8-4d73-af22-fa01ef4a7359","Type":"ContainerStarted","Data":"ebc136551e513f53e2acbe42e265e9dd9df3f060c907d7a98bef73c4a225dc54"} Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.457557 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerStarted","Data":"ff073f2d4c081fada2883b37f35160e914145ac4ce636d506e8b5a06696d8058"} Oct 06 09:02:21 crc kubenswrapper[4610]: I1006 09:02:21.974739 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.468362 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"225e171a-3dd8-4d73-af22-fa01ef4a7359","Type":"ContainerStarted","Data":"ea5755636bed73666f58300c75e3a99b1339398f12949f516d8d585378ecc3c1"} Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.470854 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerStarted","Data":"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"} Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.473295 4610 generic.go:334] "Generic (PLEG): container finished" podID="37db7d0c-8148-47dd-b730-b471fd07f6be" containerID="4c9235706a689e81928754fc27d5d22c21347038c035b4371958dd2d2d7b1be4" exitCode=0 Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.473405 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-844b-account-create-n8hvs" event={"ID":"37db7d0c-8148-47dd-b730-b471fd07f6be","Type":"ContainerDied","Data":"4c9235706a689e81928754fc27d5d22c21347038c035b4371958dd2d2d7b1be4"} Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.477596 4610 generic.go:334] "Generic (PLEG): container finished" podID="b77db1d8-8747-4703-9eb7-80037220ecde" containerID="97ae70e2be6928e0392c0c63ed74a6b0bdaabeed2465d4ad6f18deaf921c2765" exitCode=0 Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.477648 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9537-account-create-4p6r5" event={"ID":"b77db1d8-8747-4703-9eb7-80037220ecde","Type":"ContainerDied","Data":"97ae70e2be6928e0392c0c63ed74a6b0bdaabeed2465d4ad6f18deaf921c2765"} Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.490915 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.491010 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.512366 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.514982 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.514969803 podStartE2EDuration="4.514969803s" podCreationTimestamp="2025-10-06 09:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:22.495031202 +0000 UTC m=+1274.210084600" watchObservedRunningTime="2025-10-06 09:02:22.514969803 +0000 UTC m=+1274.230023191" Oct 06 09:02:22 crc kubenswrapper[4610]: I1006 09:02:22.872769 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8b9a-account-create-jt5xg" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.048284 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj4qr\" (UniqueName: \"kubernetes.io/projected/4a39c855-fc17-4cc4-af66-dc39f28fc009-kube-api-access-qj4qr\") pod \"4a39c855-fc17-4cc4-af66-dc39f28fc009\" (UID: \"4a39c855-fc17-4cc4-af66-dc39f28fc009\") " Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.054374 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a39c855-fc17-4cc4-af66-dc39f28fc009-kube-api-access-qj4qr" (OuterVolumeSpecName: "kube-api-access-qj4qr") pod "4a39c855-fc17-4cc4-af66-dc39f28fc009" (UID: "4a39c855-fc17-4cc4-af66-dc39f28fc009"). InnerVolumeSpecName "kube-api-access-qj4qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.150014 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj4qr\" (UniqueName: \"kubernetes.io/projected/4a39c855-fc17-4cc4-af66-dc39f28fc009-kube-api-access-qj4qr\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.487551 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerStarted","Data":"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"} Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.487604 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerStarted","Data":"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"} Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.489686 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8b9a-account-create-jt5xg" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.489749 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8b9a-account-create-jt5xg" event={"ID":"4a39c855-fc17-4cc4-af66-dc39f28fc009","Type":"ContainerDied","Data":"e5470e861c0c9aa07aa204c93d2ba94dc7478dca0a86ef24a55bf722114ece7f"} Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.489813 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5470e861c0c9aa07aa204c93d2ba94dc7478dca0a86ef24a55bf722114ece7f" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.797629 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.978393 4610 util.go:48] "No ready sandbox for pod can be found. 
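The "Observed pod startup duration" entry above for glance-default-external-api-0 is internally consistent: podStartSLOduration=4.514969803 equals watchObservedRunningTime (09:02:22.514969803) minus podCreationTimestamp (09:02:18), the zero-valued 0001-01-01 pulling timestamps mean no image pull contributed (hence podStartE2EDuration equals the SLO duration), and the m=+1274.23... suffixes are Go monotonic-clock readings, i.e. seconds since the kubelet process started. A quick check of the arithmetic:

    from datetime import datetime, timezone

    created  = datetime(2025, 10, 6, 9, 2, 18, tzinfo=timezone.utc)
    # .514969803 truncated to whole microseconds for datetime
    observed = datetime(2025, 10, 6, 9, 2, 22, 514969, tzinfo=timezone.utc)

    slo = (observed - created).total_seconds()
    # Matches podStartSLOduration in the entry above to sub-millisecond.
    assert abs(slo - 4.514969803) < 1e-3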
Need to start a new one" pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.980630 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nvxc\" (UniqueName: \"kubernetes.io/projected/b77db1d8-8747-4703-9eb7-80037220ecde-kube-api-access-7nvxc\") pod \"b77db1d8-8747-4703-9eb7-80037220ecde\" (UID: \"b77db1d8-8747-4703-9eb7-80037220ecde\") " Oct 06 09:02:23 crc kubenswrapper[4610]: I1006 09:02:23.993858 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77db1d8-8747-4703-9eb7-80037220ecde-kube-api-access-7nvxc" (OuterVolumeSpecName: "kube-api-access-7nvxc") pod "b77db1d8-8747-4703-9eb7-80037220ecde" (UID: "b77db1d8-8747-4703-9eb7-80037220ecde"). InnerVolumeSpecName "kube-api-access-7nvxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.082628 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9htfj\" (UniqueName: \"kubernetes.io/projected/37db7d0c-8148-47dd-b730-b471fd07f6be-kube-api-access-9htfj\") pod \"37db7d0c-8148-47dd-b730-b471fd07f6be\" (UID: \"37db7d0c-8148-47dd-b730-b471fd07f6be\") " Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.083105 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nvxc\" (UniqueName: \"kubernetes.io/projected/b77db1d8-8747-4703-9eb7-80037220ecde-kube-api-access-7nvxc\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.085802 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37db7d0c-8148-47dd-b730-b471fd07f6be-kube-api-access-9htfj" (OuterVolumeSpecName: "kube-api-access-9htfj") pod "37db7d0c-8148-47dd-b730-b471fd07f6be" (UID: "37db7d0c-8148-47dd-b730-b471fd07f6be"). InnerVolumeSpecName "kube-api-access-9htfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.184507 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9htfj\" (UniqueName: \"kubernetes.io/projected/37db7d0c-8148-47dd-b730-b471fd07f6be-kube-api-access-9htfj\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.499991 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9537-account-create-4p6r5" event={"ID":"b77db1d8-8747-4703-9eb7-80037220ecde","Type":"ContainerDied","Data":"973229f393fc6b434e3184d5bf00e89895d08d5237d4e45c71d3c9c98ceb2753"} Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.500323 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="973229f393fc6b434e3184d5bf00e89895d08d5237d4e45c71d3c9c98ceb2753" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.500021 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9537-account-create-4p6r5" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.504484 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerStarted","Data":"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"} Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.504642 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-central-agent" containerID="cri-o://0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5" gracePeriod=30 Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.504797 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="proxy-httpd" containerID="cri-o://f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7" gracePeriod=30 Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.504799 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="sg-core" containerID="cri-o://f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd" gracePeriod=30 Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.504821 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.504871 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-notification-agent" containerID="cri-o://e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7" gracePeriod=30 Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.506944 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-844b-account-create-n8hvs" event={"ID":"37db7d0c-8148-47dd-b730-b471fd07f6be","Type":"ContainerDied","Data":"fe07f157f313b86b66f86bf2cc670bd19988395131cb036537ad3f8d1a67fc4a"} Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.506978 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe07f157f313b86b66f86bf2cc670bd19988395131cb036537ad3f8d1a67fc4a" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.507031 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-844b-account-create-n8hvs" Oct 06 09:02:24 crc kubenswrapper[4610]: I1006 09:02:24.834714 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.559842035 podStartE2EDuration="5.834696879s" podCreationTimestamp="2025-10-06 09:02:19 +0000 UTC" firstStartedPulling="2025-10-06 09:02:20.8731527 +0000 UTC m=+1272.588206088" lastFinishedPulling="2025-10-06 09:02:24.148007544 +0000 UTC m=+1275.863060932" observedRunningTime="2025-10-06 09:02:24.529639964 +0000 UTC m=+1276.244693362" watchObservedRunningTime="2025-10-06 09:02:24.834696879 +0000 UTC m=+1276.549750277" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.241808 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409252 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-run-httpd\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409320 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-sg-core-conf-yaml\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409398 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-config-data\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409483 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-combined-ca-bundle\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409528 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-log-httpd\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409548 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhw7\" (UniqueName: \"kubernetes.io/projected/03f8aefa-9c2f-432d-a804-8bb380330379-kube-api-access-tjhw7\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409604 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-scripts\") pod \"03f8aefa-9c2f-432d-a804-8bb380330379\" (UID: \"03f8aefa-9c2f-432d-a804-8bb380330379\") " Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409810 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.409884 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.410503 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.410603 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03f8aefa-9c2f-432d-a804-8bb380330379-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.414111 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-scripts" (OuterVolumeSpecName: "scripts") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.442366 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f8aefa-9c2f-432d-a804-8bb380330379-kube-api-access-tjhw7" (OuterVolumeSpecName: "kube-api-access-tjhw7") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "kube-api-access-tjhw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.445403 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8mxh7"] Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.445910 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="proxy-httpd" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.446260 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="proxy-httpd" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.446344 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77db1d8-8747-4703-9eb7-80037220ecde" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448052 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77db1d8-8747-4703-9eb7-80037220ecde" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.448186 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37db7d0c-8148-47dd-b730-b471fd07f6be" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448258 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="37db7d0c-8148-47dd-b730-b471fd07f6be" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.448333 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-notification-agent" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448384 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-notification-agent" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.448440 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a39c855-fc17-4cc4-af66-dc39f28fc009" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448486 4610 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4a39c855-fc17-4cc4-af66-dc39f28fc009" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.448556 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-central-agent" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448617 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-central-agent" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.448667 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="sg-core" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448718 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="sg-core" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.448971 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-notification-agent" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.449178 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="ceilometer-central-agent" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.449231 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="proxy-httpd" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.449291 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" containerName="sg-core" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.449348 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a39c855-fc17-4cc4-af66-dc39f28fc009" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.449399 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="37db7d0c-8148-47dd-b730-b471fd07f6be" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.449463 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77db1d8-8747-4703-9eb7-80037220ecde" containerName="mariadb-account-create" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.450281 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.452741 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7pg5x" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.455547 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.455777 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.466291 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8mxh7"] Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.467723 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.512874 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.512925 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.512936 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhw7\" (UniqueName: \"kubernetes.io/projected/03f8aefa-9c2f-432d-a804-8bb380330379-kube-api-access-tjhw7\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521229 4610 generic.go:334] "Generic (PLEG): container finished" podID="03f8aefa-9c2f-432d-a804-8bb380330379" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7" exitCode=0 Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521257 4610 generic.go:334] "Generic (PLEG): container finished" podID="03f8aefa-9c2f-432d-a804-8bb380330379" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd" exitCode=2 Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521265 4610 generic.go:334] "Generic (PLEG): container finished" podID="03f8aefa-9c2f-432d-a804-8bb380330379" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7" exitCode=0 Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521283 4610 generic.go:334] "Generic (PLEG): container finished" podID="03f8aefa-9c2f-432d-a804-8bb380330379" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5" exitCode=0 Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521293 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521302 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerDied","Data":"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"} Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521329 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerDied","Data":"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"} Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521338 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerDied","Data":"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"} Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521347 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerDied","Data":"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"} Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521357 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03f8aefa-9c2f-432d-a804-8bb380330379","Type":"ContainerDied","Data":"ff073f2d4c081fada2883b37f35160e914145ac4ce636d506e8b5a06696d8058"} Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.521372 4610 scope.go:117] "RemoveContainer" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.591910 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.614544 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-scripts\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.614623 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtq7l\" (UniqueName: \"kubernetes.io/projected/16262603-bfa5-485b-bcbc-c61b3390f964-kube-api-access-gtq7l\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.614678 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-config-data\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.614700 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.614747 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.615158 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-config-data" (OuterVolumeSpecName: "config-data") pod "03f8aefa-9c2f-432d-a804-8bb380330379" (UID: "03f8aefa-9c2f-432d-a804-8bb380330379"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.617541 4610 scope.go:117] "RemoveContainer" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.647921 4610 scope.go:117] "RemoveContainer" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.678728 4610 scope.go:117] "RemoveContainer" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.697096 4610 scope.go:117] "RemoveContainer" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.697576 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": container with ID starting with f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7 not found: ID does not exist" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.697607 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"} err="failed to get container status \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": rpc error: code = NotFound desc = could not find container \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": container with ID starting with f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7 not found: ID does not exist" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.697632 4610 scope.go:117] "RemoveContainer" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.697911 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": container with ID starting with f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd not found: ID does not exist" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.697955 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"} err="failed to get container status \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": rpc error: code = NotFound desc = could not find container \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": container with ID starting with f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd not found: ID does not exist" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.697984 4610 scope.go:117] "RemoveContainer" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7" Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.698251 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": container with ID starting with 
e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7 not found: ID does not exist" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.698275 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"} err="failed to get container status \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": rpc error: code = NotFound desc = could not find container \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": container with ID starting with e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.698288 4610 scope.go:117] "RemoveContainer" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"
Oct 06 09:02:25 crc kubenswrapper[4610]: E1006 09:02:25.698613 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": container with ID starting with 0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5 not found: ID does not exist" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.698654 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"} err="failed to get container status \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": rpc error: code = NotFound desc = could not find container \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": container with ID starting with 0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.698667 4610 scope.go:117] "RemoveContainer" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.698841 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"} err="failed to get container status \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": rpc error: code = NotFound desc = could not find container \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": container with ID starting with f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.698864 4610 scope.go:117] "RemoveContainer" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699166 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"} err="failed to get container status \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": rpc error: code = NotFound desc = could not find container \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": container with ID starting with f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699183 4610 scope.go:117] "RemoveContainer" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699374 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"} err="failed to get container status \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": rpc error: code = NotFound desc = could not find container \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": container with ID starting with e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699390 4610 scope.go:117] "RemoveContainer" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699670 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"} err="failed to get container status \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": rpc error: code = NotFound desc = could not find container \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": container with ID starting with 0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699692 4610 scope.go:117] "RemoveContainer" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699880 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"} err="failed to get container status \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": rpc error: code = NotFound desc = could not find container \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": container with ID starting with f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.699899 4610 scope.go:117] "RemoveContainer" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700178 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"} err="failed to get container status \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": rpc error: code = NotFound desc = could not find container \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": container with ID starting with f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700215 4610 scope.go:117] "RemoveContainer" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700382 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"} err="failed to get container status \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": rpc error: code = NotFound desc = could not find container \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": container with ID starting with e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700399 4610 scope.go:117] "RemoveContainer" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700583 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"} err="failed to get container status \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": rpc error: code = NotFound desc = could not find container \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": container with ID starting with 0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700601 4610 scope.go:117] "RemoveContainer" containerID="f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700797 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7"} err="failed to get container status \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": rpc error: code = NotFound desc = could not find container \"f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7\": container with ID starting with f8cf98a7333df4e0bdbbbbe37afa8e41f67dcf96cb26d044141be7d2317c88b7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.700814 4610 scope.go:117] "RemoveContainer" containerID="f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.701097 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd"} err="failed to get container status \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": rpc error: code = NotFound desc = could not find container \"f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd\": container with ID starting with f67fbbb45758c832c93ff397c1323ed72630f774abc869b18f18e5375e3752cd not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.701115 4610 scope.go:117] "RemoveContainer" containerID="e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.701298 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7"} err="failed to get container status \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": rpc error: code = NotFound desc = could not find container \"e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7\": container with ID starting with e76bbde4379b4d78adcec44f15cf3f49ba096d0840c4bc4335f19afb027d63e7 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.701313 4610 scope.go:117] "RemoveContainer" containerID="0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.701510 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5"} err="failed to get container status \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": rpc error: code = NotFound desc = could not find container \"0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5\": container with ID starting with 0ddb247886d01b14c37912197e76e01f72f2cd997a873ee5de8d04eff9296ca5 not found: ID does not exist"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.716717 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtq7l\" (UniqueName: \"kubernetes.io/projected/16262603-bfa5-485b-bcbc-c61b3390f964-kube-api-access-gtq7l\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.716834 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-config-data\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.716874 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.716948 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-scripts\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.717017 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8aefa-9c2f-432d-a804-8bb380330379-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.720927 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-scripts\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.721319 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.722209 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-config-data\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
\"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.739993 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtq7l\" (UniqueName: \"kubernetes.io/projected/16262603-bfa5-485b-bcbc-c61b3390f964-kube-api-access-gtq7l\") pod \"nova-cell0-conductor-db-sync-8mxh7\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") " pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.849343 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.859629 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.878710 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.880702 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.885553 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.885748 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.901696 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:25 crc kubenswrapper[4610]: I1006 09:02:25.910281 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.021713 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.021931 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-run-httpd\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.021973 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-log-httpd\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.022008 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-scripts\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.022090 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.022119 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-config-data\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.022246 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zt88\" (UniqueName: \"kubernetes.io/projected/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-kube-api-access-9zt88\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124374 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124432 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-run-httpd\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124480 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-log-httpd\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124525 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-scripts\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124570 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124603 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-config-data\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124634 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zt88\" (UniqueName: \"kubernetes.io/projected/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-kube-api-access-9zt88\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.124966 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-run-httpd\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.125545 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-log-httpd\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.131768 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.132280 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-config-data\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.132423 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.133195 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-scripts\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.164107 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zt88\" (UniqueName: \"kubernetes.io/projected/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-kube-api-access-9zt88\") pod \"ceilometer-0\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.199257 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.233958 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8mxh7"] Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.530495 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" event={"ID":"16262603-bfa5-485b-bcbc-c61b3390f964","Type":"ContainerStarted","Data":"e0f02daca1f1f37d13417072be4e552e6c4f5d57901ea2875d581b34008273c1"} Oct 06 09:02:26 crc kubenswrapper[4610]: I1006 09:02:26.722389 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:27 crc kubenswrapper[4610]: I1006 09:02:27.082886 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f8aefa-9c2f-432d-a804-8bb380330379" path="/var/lib/kubelet/pods/03f8aefa-9c2f-432d-a804-8bb380330379/volumes" Oct 06 09:02:27 crc kubenswrapper[4610]: I1006 09:02:27.544627 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerStarted","Data":"ad525c55f4bdff3e8d2bc31cca3f093bb7eabb11b7a763c89493e166db2f19bd"} Oct 06 09:02:27 crc kubenswrapper[4610]: I1006 09:02:27.899285 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 09:02:28 crc kubenswrapper[4610]: I1006 09:02:28.564007 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerStarted","Data":"b4c9330171368c0c66ff1684524199cc924ec2e104805edc19c85f5d1cd2c51b"} Oct 06 09:02:28 crc kubenswrapper[4610]: I1006 09:02:28.564595 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerStarted","Data":"72aaeb8aa8dcb0e339df7beaf18e9bdbc00abbcb0c2587d2800ef886bf5d8529"} Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.004868 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.028847 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.029094 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.119327 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.119457 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.574276 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerStarted","Data":"69443c7b9f8b7707263cc361d3c27ce3ce6bc03f216a5ccddc37d666652b630d"} Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.574324 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 09:02:29 crc kubenswrapper[4610]: I1006 09:02:29.574507 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 
09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601054 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601590 4610 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601010 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerStarted","Data":"b5c4e18ff18e725fd67289dccb72930091717240cac03688902806c609d07cca"} Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601224 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-notification-agent" containerID="cri-o://b4c9330171368c0c66ff1684524199cc924ec2e104805edc19c85f5d1cd2c51b" gracePeriod=30 Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601218 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="sg-core" containerID="cri-o://69443c7b9f8b7707263cc361d3c27ce3ce6bc03f216a5ccddc37d666652b630d" gracePeriod=30 Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601249 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="proxy-httpd" containerID="cri-o://b5c4e18ff18e725fd67289dccb72930091717240cac03688902806c609d07cca" gracePeriod=30 Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.601167 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-central-agent" containerID="cri-o://72aaeb8aa8dcb0e339df7beaf18e9bdbc00abbcb0c2587d2800ef886bf5d8529" gracePeriod=30 Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.602024 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.631384 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9744028179999997 podStartE2EDuration="6.631367424s" podCreationTimestamp="2025-10-06 09:02:25 +0000 UTC" firstStartedPulling="2025-10-06 09:02:26.713063755 +0000 UTC m=+1278.428117143" lastFinishedPulling="2025-10-06 09:02:30.370028361 +0000 UTC m=+1282.085081749" observedRunningTime="2025-10-06 09:02:31.629801634 +0000 UTC m=+1283.344855042" watchObservedRunningTime="2025-10-06 09:02:31.631367424 +0000 UTC m=+1283.346420832" Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.974637 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 09:02:31 crc kubenswrapper[4610]: I1006 09:02:31.982475 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 09:02:32 crc kubenswrapper[4610]: I1006 09:02:32.611326 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerID="b5c4e18ff18e725fd67289dccb72930091717240cac03688902806c609d07cca" exitCode=0 Oct 06 09:02:32 crc kubenswrapper[4610]: I1006 09:02:32.611593 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" 
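[Editor's note] The "Observed pod startup duration" entry above for ceilometer-0 can be re-derived from its own fields. Assuming the usual semantics (E2E = observed running time minus creation time; SLO duration additionally excludes the image-pull window), the numbers check out. A Python sketch:

from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f"
created    = datetime.strptime("2025-10-06 09:02:25.000000", FMT)  # podCreationTimestamp
running    = datetime.strptime("2025-10-06 09:02:31.631367", FMT)  # watchObservedRunningTime
pull_start = datetime.strptime("2025-10-06 09:02:26.713064", FMT)  # firstStartedPulling
pull_end   = datetime.strptime("2025-10-06 09:02:30.370028", FMT)  # lastFinishedPulling

e2e = (running - created).total_seconds()            # ~6.631s = podStartE2EDuration
slo = e2e - (pull_end - pull_start).total_seconds()  # ~2.974s = podStartSLOduration
print(f"podStartE2EDuration~{e2e:.3f}s podStartSLOduration~{slo:.3f}s")
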
containerID="69443c7b9f8b7707263cc361d3c27ce3ce6bc03f216a5ccddc37d666652b630d" exitCode=2 Oct 06 09:02:32 crc kubenswrapper[4610]: I1006 09:02:32.611603 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerID="b4c9330171368c0c66ff1684524199cc924ec2e104805edc19c85f5d1cd2c51b" exitCode=0 Oct 06 09:02:32 crc kubenswrapper[4610]: I1006 09:02:32.611373 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerDied","Data":"b5c4e18ff18e725fd67289dccb72930091717240cac03688902806c609d07cca"} Oct 06 09:02:32 crc kubenswrapper[4610]: I1006 09:02:32.611683 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerDied","Data":"69443c7b9f8b7707263cc361d3c27ce3ce6bc03f216a5ccddc37d666652b630d"} Oct 06 09:02:32 crc kubenswrapper[4610]: I1006 09:02:32.611698 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerDied","Data":"b4c9330171368c0c66ff1684524199cc924ec2e104805edc19c85f5d1cd2c51b"} Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.655078 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" event={"ID":"16262603-bfa5-485b-bcbc-c61b3390f964","Type":"ContainerStarted","Data":"ffc63b5c1a5a389db05fa90cde7d066e8b58f29bd02a274b951ca374e62b233b"} Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.661627 4610 generic.go:334] "Generic (PLEG): container finished" podID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerID="72aaeb8aa8dcb0e339df7beaf18e9bdbc00abbcb0c2587d2800ef886bf5d8529" exitCode=0 Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.661660 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerDied","Data":"72aaeb8aa8dcb0e339df7beaf18e9bdbc00abbcb0c2587d2800ef886bf5d8529"} Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.747245 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.770424 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" podStartSLOduration=1.627837014 podStartE2EDuration="12.770409225s" podCreationTimestamp="2025-10-06 09:02:25 +0000 UTC" firstStartedPulling="2025-10-06 09:02:26.245139418 +0000 UTC m=+1277.960192816" lastFinishedPulling="2025-10-06 09:02:37.387711639 +0000 UTC m=+1289.102765027" observedRunningTime="2025-10-06 09:02:37.675121681 +0000 UTC m=+1289.390175089" watchObservedRunningTime="2025-10-06 09:02:37.770409225 +0000 UTC m=+1289.485462613" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855008 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-scripts\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855242 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-run-httpd\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855338 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-sg-core-conf-yaml\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855397 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-combined-ca-bundle\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855439 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zt88\" (UniqueName: \"kubernetes.io/projected/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-kube-api-access-9zt88\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855532 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-log-httpd\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855591 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-config-data\") pod \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\" (UID: \"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31\") " Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.855919 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.856189 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.856262 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.862334 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-scripts" (OuterVolumeSpecName: "scripts") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.866834 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-kube-api-access-9zt88" (OuterVolumeSpecName: "kube-api-access-9zt88") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "kube-api-access-9zt88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.903150 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.942616 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.958541 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.958821 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.958887 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.958942 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zt88\" (UniqueName: \"kubernetes.io/projected/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-kube-api-access-9zt88\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.959056 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:37 crc kubenswrapper[4610]: I1006 09:02:37.978199 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-config-data" (OuterVolumeSpecName: "config-data") pod "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" (UID: "8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.060281 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.674070 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31","Type":"ContainerDied","Data":"ad525c55f4bdff3e8d2bc31cca3f093bb7eabb11b7a763c89493e166db2f19bd"} Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.674404 4610 scope.go:117] "RemoveContainer" containerID="b5c4e18ff18e725fd67289dccb72930091717240cac03688902806c609d07cca" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.674092 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.717936 4610 scope.go:117] "RemoveContainer" containerID="69443c7b9f8b7707263cc361d3c27ce3ce6bc03f216a5ccddc37d666652b630d" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.719321 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.731378 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.745835 4610 scope.go:117] "RemoveContainer" containerID="b4c9330171368c0c66ff1684524199cc924ec2e104805edc19c85f5d1cd2c51b" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.755545 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:38 crc kubenswrapper[4610]: E1006 09:02:38.755917 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-notification-agent" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.755933 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-notification-agent" Oct 06 09:02:38 crc kubenswrapper[4610]: E1006 09:02:38.755963 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="sg-core" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.755970 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="sg-core" Oct 06 09:02:38 crc kubenswrapper[4610]: E1006 09:02:38.755984 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="proxy-httpd" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.755991 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="proxy-httpd" Oct 06 09:02:38 crc kubenswrapper[4610]: E1006 09:02:38.756000 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-central-agent" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.756006 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-central-agent" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.756191 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="sg-core" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.756209 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-notification-agent" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.756221 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="proxy-httpd" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.756235 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" containerName="ceilometer-central-agent" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.757948 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.761652 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.761788 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.774469 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.787753 4610 scope.go:117] "RemoveContainer" containerID="72aaeb8aa8dcb0e339df7beaf18e9bdbc00abbcb0c2587d2800ef886bf5d8529" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875245 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875320 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhxz\" (UniqueName: \"kubernetes.io/projected/13761fa8-a633-4047-a034-12077c20d9f0-kube-api-access-zqhxz\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875354 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-log-httpd\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875394 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-config-data\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875510 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-run-httpd\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875582 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-scripts\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.875616 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.976828 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.976889 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhxz\" (UniqueName: \"kubernetes.io/projected/13761fa8-a633-4047-a034-12077c20d9f0-kube-api-access-zqhxz\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.976914 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-log-httpd\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.976939 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-config-data\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.977009 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-run-httpd\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.977098 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-scripts\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.977133 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.978399 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-log-httpd\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.978500 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-run-httpd\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.984003 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-scripts\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.984138 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.987771 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-config-data\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:38 crc kubenswrapper[4610]: I1006 09:02:38.992414 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:39 crc kubenswrapper[4610]: I1006 09:02:39.000747 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhxz\" (UniqueName: \"kubernetes.io/projected/13761fa8-a633-4047-a034-12077c20d9f0-kube-api-access-zqhxz\") pod \"ceilometer-0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " pod="openstack/ceilometer-0" Oct 06 09:02:39 crc kubenswrapper[4610]: I1006 09:02:39.080202 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31" path="/var/lib/kubelet/pods/8c721ca3-1d14-430c-bb4a-7e2c4f2c2e31/volumes" Oct 06 09:02:39 crc kubenswrapper[4610]: I1006 09:02:39.090181 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:02:39 crc kubenswrapper[4610]: I1006 09:02:39.558551 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:02:39 crc kubenswrapper[4610]: W1006 09:02:39.562896 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13761fa8_a633_4047_a034_12077c20d9f0.slice/crio-19e5534c72d577602a57a72b1d52f0a0b0f420442e8dd6272ac5e4b087b974e2 WatchSource:0}: Error finding container 19e5534c72d577602a57a72b1d52f0a0b0f420442e8dd6272ac5e4b087b974e2: Status 404 returned error can't find the container with id 19e5534c72d577602a57a72b1d52f0a0b0f420442e8dd6272ac5e4b087b974e2 Oct 06 09:02:39 crc kubenswrapper[4610]: I1006 09:02:39.684189 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerStarted","Data":"19e5534c72d577602a57a72b1d52f0a0b0f420442e8dd6272ac5e4b087b974e2"} Oct 06 09:02:40 crc kubenswrapper[4610]: I1006 09:02:40.694704 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerStarted","Data":"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926"} Oct 06 09:02:41 crc kubenswrapper[4610]: I1006 09:02:41.709882 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerStarted","Data":"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266"} Oct 06 09:02:42 crc kubenswrapper[4610]: I1006 09:02:42.721306 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerStarted","Data":"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1"} Oct 06 
Oct 06 09:02:43 crc kubenswrapper[4610]: I1006 09:02:43.735083 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerStarted","Data":"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c"}
Oct 06 09:02:43 crc kubenswrapper[4610]: I1006 09:02:43.735452 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 09:02:43 crc kubenswrapper[4610]: I1006 09:02:43.760465 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.548722873 podStartE2EDuration="5.760445172s" podCreationTimestamp="2025-10-06 09:02:38 +0000 UTC" firstStartedPulling="2025-10-06 09:02:39.564524894 +0000 UTC m=+1291.279578282" lastFinishedPulling="2025-10-06 09:02:42.776247193 +0000 UTC m=+1294.491300581" observedRunningTime="2025-10-06 09:02:43.756418851 +0000 UTC m=+1295.471472259" watchObservedRunningTime="2025-10-06 09:02:43.760445172 +0000 UTC m=+1295.475498600"
Oct 06 09:02:49 crc kubenswrapper[4610]: I1006 09:02:49.789238 4610 generic.go:334] "Generic (PLEG): container finished" podID="16262603-bfa5-485b-bcbc-c61b3390f964" containerID="ffc63b5c1a5a389db05fa90cde7d066e8b58f29bd02a274b951ca374e62b233b" exitCode=0
Oct 06 09:02:49 crc kubenswrapper[4610]: I1006 09:02:49.789436 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" event={"ID":"16262603-bfa5-485b-bcbc-c61b3390f964","Type":"ContainerDied","Data":"ffc63b5c1a5a389db05fa90cde7d066e8b58f29bd02a274b951ca374e62b233b"}
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.185720 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.216264 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-combined-ca-bundle\") pod \"16262603-bfa5-485b-bcbc-c61b3390f964\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") "
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.216403 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-scripts\") pod \"16262603-bfa5-485b-bcbc-c61b3390f964\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") "
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.216468 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtq7l\" (UniqueName: \"kubernetes.io/projected/16262603-bfa5-485b-bcbc-c61b3390f964-kube-api-access-gtq7l\") pod \"16262603-bfa5-485b-bcbc-c61b3390f964\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") "
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.216534 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-config-data\") pod \"16262603-bfa5-485b-bcbc-c61b3390f964\" (UID: \"16262603-bfa5-485b-bcbc-c61b3390f964\") "
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.222729 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-scripts" (OuterVolumeSpecName: "scripts") pod "16262603-bfa5-485b-bcbc-c61b3390f964" (UID: "16262603-bfa5-485b-bcbc-c61b3390f964"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.228263 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16262603-bfa5-485b-bcbc-c61b3390f964-kube-api-access-gtq7l" (OuterVolumeSpecName: "kube-api-access-gtq7l") pod "16262603-bfa5-485b-bcbc-c61b3390f964" (UID: "16262603-bfa5-485b-bcbc-c61b3390f964"). InnerVolumeSpecName "kube-api-access-gtq7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.250527 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16262603-bfa5-485b-bcbc-c61b3390f964" (UID: "16262603-bfa5-485b-bcbc-c61b3390f964"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.261915 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-config-data" (OuterVolumeSpecName: "config-data") pod "16262603-bfa5-485b-bcbc-c61b3390f964" (UID: "16262603-bfa5-485b-bcbc-c61b3390f964"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.318025 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.318076 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.318088 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtq7l\" (UniqueName: \"kubernetes.io/projected/16262603-bfa5-485b-bcbc-c61b3390f964-kube-api-access-gtq7l\") on node \"crc\" DevicePath \"\""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.318099 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16262603-bfa5-485b-bcbc-c61b3390f964-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.810598 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8mxh7" event={"ID":"16262603-bfa5-485b-bcbc-c61b3390f964","Type":"ContainerDied","Data":"e0f02daca1f1f37d13417072be4e552e6c4f5d57901ea2875d581b34008273c1"}
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.810644 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f02daca1f1f37d13417072be4e552e6c4f5d57901ea2875d581b34008273c1"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.810678 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8mxh7"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.947210 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 06 09:02:51 crc kubenswrapper[4610]: E1006 09:02:51.947561 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16262603-bfa5-485b-bcbc-c61b3390f964" containerName="nova-cell0-conductor-db-sync"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.947579 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="16262603-bfa5-485b-bcbc-c61b3390f964" containerName="nova-cell0-conductor-db-sync"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.947766 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="16262603-bfa5-485b-bcbc-c61b3390f964" containerName="nova-cell0-conductor-db-sync"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.948400 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.952679 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.953510 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7pg5x"
Oct 06 09:02:51 crc kubenswrapper[4610]: I1006 09:02:51.972622 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.028505 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.028577 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.028666 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czksw\" (UniqueName: \"kubernetes.io/projected/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-kube-api-access-czksw\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.129956 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.130093 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czksw\" (UniqueName: \"kubernetes.io/projected/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-kube-api-access-czksw\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.130170 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.134994 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.137479 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.152360 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czksw\" (UniqueName: \"kubernetes.io/projected/bb2657e1-8319-4a7d-be1f-a48d66bd5ba8-kube-api-access-czksw\") pod \"nova-cell0-conductor-0\" (UID: \"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8\") " pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.268170 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:52 crc kubenswrapper[4610]: W1006 09:02:52.777163 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb2657e1_8319_4a7d_be1f_a48d66bd5ba8.slice/crio-e4efbed6de748258e78eaebcb72dca3b6175cf7b06c14eeef12585365af55de0 WatchSource:0}: Error finding container e4efbed6de748258e78eaebcb72dca3b6175cf7b06c14eeef12585365af55de0: Status 404 returned error can't find the container with id e4efbed6de748258e78eaebcb72dca3b6175cf7b06c14eeef12585365af55de0
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.779740 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 06 09:02:52 crc kubenswrapper[4610]: I1006 09:02:52.820815 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8","Type":"ContainerStarted","Data":"e4efbed6de748258e78eaebcb72dca3b6175cf7b06c14eeef12585365af55de0"}
Oct 06 09:02:53 crc kubenswrapper[4610]: I1006 09:02:53.837954 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb2657e1-8319-4a7d-be1f-a48d66bd5ba8","Type":"ContainerStarted","Data":"3bc706516bda694c730e3081817c082de5d421a75216dd54a16497c18be35439"}
Oct 06 09:02:53 crc kubenswrapper[4610]: I1006 09:02:53.838480 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:53 crc kubenswrapper[4610]: I1006 09:02:53.880989 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.880963083 podStartE2EDuration="2.880963083s" podCreationTimestamp="2025-10-06 09:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:53.857713679 +0000 UTC m=+1305.572767087" watchObservedRunningTime="2025-10-06 09:02:53.880963083 +0000 UTC m=+1305.596016481"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.311459 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.805601 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-m9pmg"]
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.807313 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m9pmg"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.809885 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.809932 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.831754 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m9pmg"]
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.966895 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-config-data\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.967014 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqwjt\" (UniqueName: \"kubernetes.io/projected/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-kube-api-access-zqwjt\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.967105 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-scripts\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg"
Oct 06 09:02:57 crc kubenswrapper[4610]: I1006 09:02:57.967139 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg"
Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.021976 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.023102 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.027399 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.069856 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-scripts\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.069915 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qgk\" (UniqueName: \"kubernetes.io/projected/067412b2-e953-438f-b53e-af51eba5313f-kube-api-access-84qgk\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.069957 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.070075 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-config-data\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.070117 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-config-data\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.070202 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.070278 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqwjt\" (UniqueName: \"kubernetes.io/projected/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-kube-api-access-zqwjt\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.086611 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-scripts\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.094003 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.101926 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-config-data\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.176230 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84qgk\" (UniqueName: \"kubernetes.io/projected/067412b2-e953-438f-b53e-af51eba5313f-kube-api-access-84qgk\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.176336 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-config-data\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.176417 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.195874 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.198521 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-config-data\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.219170 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.229825 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84qgk\" (UniqueName: \"kubernetes.io/projected/067412b2-e953-438f-b53e-af51eba5313f-kube-api-access-84qgk\") pod \"nova-scheduler-0\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.254201 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.255279 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.276960 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.286241 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvn4\" (UniqueName: \"kubernetes.io/projected/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-kube-api-access-ptvn4\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.286640 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.286666 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.300923 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqwjt\" (UniqueName: \"kubernetes.io/projected/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-kube-api-access-zqwjt\") pod \"nova-cell0-cell-mapping-m9pmg\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.360853 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.388741 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.389883 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.389940 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.389971 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvn4\" (UniqueName: \"kubernetes.io/projected/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-kube-api-access-ptvn4\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.398666 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.409124 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.432619 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.465696 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvn4\" (UniqueName: \"kubernetes.io/projected/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-kube-api-access-ptvn4\") pod \"nova-cell1-novncproxy-0\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.524329 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.542768 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.560171 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.567569 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.601278 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-config-data\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.601355 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svkrw\" (UniqueName: \"kubernetes.io/projected/601baab6-f9e5-4b9f-9938-0f857d936052-kube-api-access-svkrw\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.601379 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601baab6-f9e5-4b9f-9938-0f857d936052-logs\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.601464 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.647471 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.706384 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svkrw\" (UniqueName: \"kubernetes.io/projected/601baab6-f9e5-4b9f-9938-0f857d936052-kube-api-access-svkrw\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.706747 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601baab6-f9e5-4b9f-9938-0f857d936052-logs\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.707110 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601baab6-f9e5-4b9f-9938-0f857d936052-logs\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.707251 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.707751 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-config-data\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.720695 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.722468 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-config-data\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.730923 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.732538 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.759468 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.771720 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.815997 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596498f9-ea67-4acc-8adb-51defd92ddcf-logs\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.816087 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-config-data\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.816142 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndts\" (UniqueName: \"kubernetes.io/projected/596498f9-ea67-4acc-8adb-51defd92ddcf-kube-api-access-jndts\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.816275 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.816731 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svkrw\" (UniqueName: \"kubernetes.io/projected/601baab6-f9e5-4b9f-9938-0f857d936052-kube-api-access-svkrw\") pod \"nova-api-0\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.849938 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-fpkrp"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.855492 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.873178 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-fpkrp"] Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.901619 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918742 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-config-data\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918819 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-svc\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918843 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndts\" (UniqueName: \"kubernetes.io/projected/596498f9-ea67-4acc-8adb-51defd92ddcf-kube-api-access-jndts\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918862 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-config\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918883 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918921 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918937 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.918969 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zktmm\" (UniqueName: \"kubernetes.io/projected/73c3f46d-9104-485f-8e6e-23b414a33760-kube-api-access-zktmm\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.919001 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596498f9-ea67-4acc-8adb-51defd92ddcf-logs\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc 
kubenswrapper[4610]: I1006 09:02:58.919036 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.922272 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596498f9-ea67-4acc-8adb-51defd92ddcf-logs\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.928917 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.938874 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndts\" (UniqueName: \"kubernetes.io/projected/596498f9-ea67-4acc-8adb-51defd92ddcf-kube-api-access-jndts\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:58 crc kubenswrapper[4610]: I1006 09:02:58.944226 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-config-data\") pod \"nova-metadata-0\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " pod="openstack/nova-metadata-0" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.019978 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.020037 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.020080 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zktmm\" (UniqueName: \"kubernetes.io/projected/73c3f46d-9104-485f-8e6e-23b414a33760-kube-api-access-zktmm\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.020129 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.020193 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-svc\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.020211 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-config\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.020954 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-config\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.021446 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.021905 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.023323 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.023978 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-svc\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.039918 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zktmm\" (UniqueName: \"kubernetes.io/projected/73c3f46d-9104-485f-8e6e-23b414a33760-kube-api-access-zktmm\") pod \"dnsmasq-dns-865f5d856f-fpkrp\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.122297 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.184438 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.297565 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.312981 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.342985 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m9pmg"] Oct 06 09:02:59 crc kubenswrapper[4610]: W1006 09:02:59.389794 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7411bbf_edb9_450b_ae3b_02ccaa0dd04a.slice/crio-1d281c727e3adc8ade8c99af90e186353ad0e6d16c7229f6d065d0c5f4b6bab9 WatchSource:0}: Error finding container 1d281c727e3adc8ade8c99af90e186353ad0e6d16c7229f6d065d0c5f4b6bab9: Status 404 returned error can't find the container with id 1d281c727e3adc8ade8c99af90e186353ad0e6d16c7229f6d065d0c5f4b6bab9 Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.484807 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.651670 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.853702 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.918980 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"596498f9-ea67-4acc-8adb-51defd92ddcf","Type":"ContainerStarted","Data":"34d3766f5d5ee3efacb4fb96a51765761061330bedf2835cdcf274f497017ceb"} Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.928484 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m9pmg" event={"ID":"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a","Type":"ContainerStarted","Data":"05f9f58f875be83673aae401d57ca88b0deefa5852be2171880ae839549e78d1"} Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.928534 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m9pmg" event={"ID":"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a","Type":"ContainerStarted","Data":"1d281c727e3adc8ade8c99af90e186353ad0e6d16c7229f6d065d0c5f4b6bab9"} Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.931555 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"758e6b3d-66a1-488f-a861-5ee5ab3a7b56","Type":"ContainerStarted","Data":"f07673e0d62167090a2ea0726b12839f440a2063e72afce24eadeb372bc6e8b6"} Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.940180 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"067412b2-e953-438f-b53e-af51eba5313f","Type":"ContainerStarted","Data":"3efd7f5ef283a674bb45e56177635b749fef3402c5bf752309318344b61cdf00"} Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.941918 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601baab6-f9e5-4b9f-9938-0f857d936052","Type":"ContainerStarted","Data":"acd81ebc85619ec99681e3d01e40e773b549a23c47bb75bb2c3caea1bbfe5ad5"} Oct 06 09:02:59 crc kubenswrapper[4610]: I1006 09:02:59.943567 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-m9pmg" podStartSLOduration=2.943551593 podStartE2EDuration="2.943551593s" podCreationTimestamp="2025-10-06 09:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:02:59.940702191 +0000 UTC m=+1311.655755579" watchObservedRunningTime="2025-10-06 09:02:59.943551593 +0000 UTC m=+1311.658604971" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.006858 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vflrs"] Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.008175 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.010854 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.012797 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.015552 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-fpkrp"] Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.054615 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vflrs"] Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.056827 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-scripts\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.056933 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.057119 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpc8x\" (UniqueName: \"kubernetes.io/projected/f6ed6294-5577-470e-8571-199cc7cc777d-kube-api-access-bpc8x\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.057294 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-config-data\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.159465 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpc8x\" (UniqueName: \"kubernetes.io/projected/f6ed6294-5577-470e-8571-199cc7cc777d-kube-api-access-bpc8x\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " 
pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.159564 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-config-data\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.159635 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-scripts\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.159656 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.165511 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.172210 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-scripts\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.177466 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-config-data\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.181003 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpc8x\" (UniqueName: \"kubernetes.io/projected/f6ed6294-5577-470e-8571-199cc7cc777d-kube-api-access-bpc8x\") pod \"nova-cell1-conductor-db-sync-vflrs\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.345776 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.961880 4610 generic.go:334] "Generic (PLEG): container finished" podID="73c3f46d-9104-485f-8e6e-23b414a33760" containerID="8be2e7ecda79f2b21dbce606dfcf749fe3edeb5ab7edfefa61f1a950a575a13e" exitCode=0 Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.962348 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" event={"ID":"73c3f46d-9104-485f-8e6e-23b414a33760","Type":"ContainerDied","Data":"8be2e7ecda79f2b21dbce606dfcf749fe3edeb5ab7edfefa61f1a950a575a13e"} Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.962626 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" event={"ID":"73c3f46d-9104-485f-8e6e-23b414a33760","Type":"ContainerStarted","Data":"d6dd02b4cd9541c04b57226a9238eac43fab096cb7e0e90f56236d4983c70f1f"} Oct 06 09:03:00 crc kubenswrapper[4610]: I1006 09:03:00.982907 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vflrs"] Oct 06 09:03:00 crc kubenswrapper[4610]: W1006 09:03:00.999191 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6ed6294_5577_470e_8571_199cc7cc777d.slice/crio-bbdc466fee3b07b148e0604bd07f995851ce61617dff894ca859ab0b077e49a4 WatchSource:0}: Error finding container bbdc466fee3b07b148e0604bd07f995851ce61617dff894ca859ab0b077e49a4: Status 404 returned error can't find the container with id bbdc466fee3b07b148e0604bd07f995851ce61617dff894ca859ab0b077e49a4 Oct 06 09:03:01 crc kubenswrapper[4610]: I1006 09:03:01.980908 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vflrs" event={"ID":"f6ed6294-5577-470e-8571-199cc7cc777d","Type":"ContainerStarted","Data":"ca5fdc62c020cdfbae7862d83298cb2627b8031bbecc60a7d4942edf1bab1b9a"} Oct 06 09:03:01 crc kubenswrapper[4610]: I1006 09:03:01.981167 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vflrs" event={"ID":"f6ed6294-5577-470e-8571-199cc7cc777d","Type":"ContainerStarted","Data":"bbdc466fee3b07b148e0604bd07f995851ce61617dff894ca859ab0b077e49a4"} Oct 06 09:03:01 crc kubenswrapper[4610]: I1006 09:03:01.982792 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" event={"ID":"73c3f46d-9104-485f-8e6e-23b414a33760","Type":"ContainerStarted","Data":"a5b2e9c60aa1f13426f74250ac90c5b3ba2d4414eb9d8daf9c9bae9a526d60ca"} Oct 06 09:03:01 crc kubenswrapper[4610]: I1006 09:03:01.982933 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:03:01 crc kubenswrapper[4610]: I1006 09:03:01.998628 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-vflrs" podStartSLOduration=2.998609629 podStartE2EDuration="2.998609629s" podCreationTimestamp="2025-10-06 09:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:01.991405558 +0000 UTC m=+1313.706458946" watchObservedRunningTime="2025-10-06 09:03:01.998609629 +0000 UTC m=+1313.713663017" Oct 06 09:03:02 crc kubenswrapper[4610]: I1006 09:03:02.018355 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" podStartSLOduration=4.018309904 podStartE2EDuration="4.018309904s" podCreationTimestamp="2025-10-06 09:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:02.009706378 +0000 UTC m=+1313.724759766" watchObservedRunningTime="2025-10-06 09:03:02.018309904 +0000 UTC m=+1313.733363302" Oct 06 09:03:02 crc kubenswrapper[4610]: I1006 09:03:02.472594 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:03:02 crc kubenswrapper[4610]: I1006 09:03:02.487407 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.004024 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601baab6-f9e5-4b9f-9938-0f857d936052","Type":"ContainerStarted","Data":"2949f39bb9c0ccd2c3f3e629246725f6f64c20c8708e9c50698e0e21b9f4780e"} Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.004472 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601baab6-f9e5-4b9f-9938-0f857d936052","Type":"ContainerStarted","Data":"1af331a1cbacd4d93576dcd593c82a0f47aeb5008c181d6d0ec5b2aa350024ce"} Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.006207 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"596498f9-ea67-4acc-8adb-51defd92ddcf","Type":"ContainerStarted","Data":"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8"} Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.007780 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"758e6b3d-66a1-488f-a861-5ee5ab3a7b56","Type":"ContainerStarted","Data":"2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3"} Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.007844 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="758e6b3d-66a1-488f-a861-5ee5ab3a7b56" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3" gracePeriod=30 Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.011244 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"067412b2-e953-438f-b53e-af51eba5313f","Type":"ContainerStarted","Data":"f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e"} Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.027994 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.391808996 podStartE2EDuration="6.027976909s" podCreationTimestamp="2025-10-06 09:02:58 +0000 UTC" firstStartedPulling="2025-10-06 09:02:59.680610686 +0000 UTC m=+1311.395664074" lastFinishedPulling="2025-10-06 09:03:03.316778609 +0000 UTC m=+1315.031831987" observedRunningTime="2025-10-06 09:03:04.023969559 +0000 UTC m=+1315.739022947" watchObservedRunningTime="2025-10-06 09:03:04.027976909 +0000 UTC m=+1315.743030297" Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.043480 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.235823687 podStartE2EDuration="6.043464028s" podCreationTimestamp="2025-10-06 09:02:58 +0000 UTC" 
firstStartedPulling="2025-10-06 09:02:59.506983834 +0000 UTC m=+1311.222037212" lastFinishedPulling="2025-10-06 09:03:03.314624165 +0000 UTC m=+1315.029677553" observedRunningTime="2025-10-06 09:03:04.041582461 +0000 UTC m=+1315.756635869" watchObservedRunningTime="2025-10-06 09:03:04.043464028 +0000 UTC m=+1315.758517416" Oct 06 09:03:04 crc kubenswrapper[4610]: I1006 09:03:04.063230 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.059448876 podStartE2EDuration="7.063212865s" podCreationTimestamp="2025-10-06 09:02:57 +0000 UTC" firstStartedPulling="2025-10-06 09:02:59.312790915 +0000 UTC m=+1311.027844303" lastFinishedPulling="2025-10-06 09:03:03.316554904 +0000 UTC m=+1315.031608292" observedRunningTime="2025-10-06 09:03:04.059792589 +0000 UTC m=+1315.774845977" watchObservedRunningTime="2025-10-06 09:03:04.063212865 +0000 UTC m=+1315.778266253" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.033794 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"596498f9-ea67-4acc-8adb-51defd92ddcf","Type":"ContainerStarted","Data":"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d"} Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.034233 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-log" containerID="cri-o://40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8" gracePeriod=30 Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.034618 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-metadata" containerID="cri-o://43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d" gracePeriod=30 Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.059684 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.595720355 podStartE2EDuration="7.059635741s" podCreationTimestamp="2025-10-06 09:02:58 +0000 UTC" firstStartedPulling="2025-10-06 09:02:59.860317161 +0000 UTC m=+1311.575370549" lastFinishedPulling="2025-10-06 09:03:03.324232547 +0000 UTC m=+1315.039285935" observedRunningTime="2025-10-06 09:03:05.052289807 +0000 UTC m=+1316.767343195" watchObservedRunningTime="2025-10-06 09:03:05.059635741 +0000 UTC m=+1316.774689129" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.669235 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.738109 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jndts\" (UniqueName: \"kubernetes.io/projected/596498f9-ea67-4acc-8adb-51defd92ddcf-kube-api-access-jndts\") pod \"596498f9-ea67-4acc-8adb-51defd92ddcf\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.738180 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-combined-ca-bundle\") pod \"596498f9-ea67-4acc-8adb-51defd92ddcf\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.738284 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-config-data\") pod \"596498f9-ea67-4acc-8adb-51defd92ddcf\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.738396 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596498f9-ea67-4acc-8adb-51defd92ddcf-logs\") pod \"596498f9-ea67-4acc-8adb-51defd92ddcf\" (UID: \"596498f9-ea67-4acc-8adb-51defd92ddcf\") " Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.738744 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596498f9-ea67-4acc-8adb-51defd92ddcf-logs" (OuterVolumeSpecName: "logs") pod "596498f9-ea67-4acc-8adb-51defd92ddcf" (UID: "596498f9-ea67-4acc-8adb-51defd92ddcf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.739179 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596498f9-ea67-4acc-8adb-51defd92ddcf-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.762773 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596498f9-ea67-4acc-8adb-51defd92ddcf-kube-api-access-jndts" (OuterVolumeSpecName: "kube-api-access-jndts") pod "596498f9-ea67-4acc-8adb-51defd92ddcf" (UID: "596498f9-ea67-4acc-8adb-51defd92ddcf"). InnerVolumeSpecName "kube-api-access-jndts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.777554 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "596498f9-ea67-4acc-8adb-51defd92ddcf" (UID: "596498f9-ea67-4acc-8adb-51defd92ddcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.780978 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-config-data" (OuterVolumeSpecName: "config-data") pod "596498f9-ea67-4acc-8adb-51defd92ddcf" (UID: "596498f9-ea67-4acc-8adb-51defd92ddcf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.840796 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.841014 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jndts\" (UniqueName: \"kubernetes.io/projected/596498f9-ea67-4acc-8adb-51defd92ddcf-kube-api-access-jndts\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:05 crc kubenswrapper[4610]: I1006 09:03:05.841089 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596498f9-ea67-4acc-8adb-51defd92ddcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.046805 4610 generic.go:334] "Generic (PLEG): container finished" podID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerID="43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d" exitCode=0 Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.046831 4610 generic.go:334] "Generic (PLEG): container finished" podID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerID="40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8" exitCode=143 Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.046849 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"596498f9-ea67-4acc-8adb-51defd92ddcf","Type":"ContainerDied","Data":"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d"} Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.046874 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"596498f9-ea67-4acc-8adb-51defd92ddcf","Type":"ContainerDied","Data":"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8"} Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.046884 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"596498f9-ea67-4acc-8adb-51defd92ddcf","Type":"ContainerDied","Data":"34d3766f5d5ee3efacb4fb96a51765761061330bedf2835cdcf274f497017ceb"} Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.046900 4610 scope.go:117] "RemoveContainer" containerID="43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.047011 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.086338 4610 scope.go:117] "RemoveContainer" containerID="40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.095160 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.105770 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.121326 4610 scope.go:117] "RemoveContainer" containerID="43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d" Oct 06 09:03:06 crc kubenswrapper[4610]: E1006 09:03:06.121708 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d\": container with ID starting with 43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d not found: ID does not exist" containerID="43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.121740 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d"} err="failed to get container status \"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d\": rpc error: code = NotFound desc = could not find container \"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d\": container with ID starting with 43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d not found: ID does not exist" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.121767 4610 scope.go:117] "RemoveContainer" containerID="40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8" Oct 06 09:03:06 crc kubenswrapper[4610]: E1006 09:03:06.122022 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8\": container with ID starting with 40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8 not found: ID does not exist" containerID="40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.122129 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8"} err="failed to get container status \"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8\": rpc error: code = NotFound desc = could not find container \"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8\": container with ID starting with 40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8 not found: ID does not exist" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.122152 4610 scope.go:117] "RemoveContainer" containerID="43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.122387 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d"} err="failed to get container status \"43d6c950e9d826b8be191f34474cdef98313f559ef39c5daa791735ff693b83d\": rpc error: code = NotFound 
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.122413 4610 scope.go:117] "RemoveContainer" containerID="40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.122592 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8"} err="failed to get container status \"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8\": rpc error: code = NotFound desc = could not find container \"40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8\": container with ID starting with 40abca63e166fa0635e9264f884e79e1a12602aa1d6e82539962fe14334186b8 not found: ID does not exist"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.133946 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 09:03:06 crc kubenswrapper[4610]: E1006 09:03:06.134394 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-metadata"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.134415 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-metadata"
Oct 06 09:03:06 crc kubenswrapper[4610]: E1006 09:03:06.134442 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-log"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.134450 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-log"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.135781 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-log"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.135805 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" containerName="nova-metadata-metadata"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.136945 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.139672 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.139791 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.157846 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.246720 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.246894 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjxn\" (UniqueName: \"kubernetes.io/projected/485c102e-5c6c-46d6-a459-564d894991d4-kube-api-access-4mjxn\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.246931 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.246980 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c102e-5c6c-46d6-a459-564d894991d4-logs\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.247036 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-config-data\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.348799 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjxn\" (UniqueName: \"kubernetes.io/projected/485c102e-5c6c-46d6-a459-564d894991d4-kube-api-access-4mjxn\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.348860 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.348958 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c102e-5c6c-46d6-a459-564d894991d4-logs\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0"
pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.348991 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-config-data\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.349024 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.350451 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c102e-5c6c-46d6-a459-564d894991d4-logs\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.354551 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-config-data\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.354794 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.360402 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.372860 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjxn\" (UniqueName: \"kubernetes.io/projected/485c102e-5c6c-46d6-a459-564d894991d4-kube-api-access-4mjxn\") pod \"nova-metadata-0\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.466151 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:06 crc kubenswrapper[4610]: I1006 09:03:06.993249 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:07 crc kubenswrapper[4610]: I1006 09:03:07.110297 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596498f9-ea67-4acc-8adb-51defd92ddcf" path="/var/lib/kubelet/pods/596498f9-ea67-4acc-8adb-51defd92ddcf/volumes" Oct 06 09:03:07 crc kubenswrapper[4610]: I1006 09:03:07.111065 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"485c102e-5c6c-46d6-a459-564d894991d4","Type":"ContainerStarted","Data":"b4d4b7d734bccf95375be9f82149ee42f25fb17785a79c1039333855c067059b"} Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.089838 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"485c102e-5c6c-46d6-a459-564d894991d4","Type":"ContainerStarted","Data":"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153"} Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.090182 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"485c102e-5c6c-46d6-a459-564d894991d4","Type":"ContainerStarted","Data":"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836"} Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.124322 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.124301995 podStartE2EDuration="2.124301995s" podCreationTimestamp="2025-10-06 09:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:08.114925989 +0000 UTC m=+1319.829979457" watchObservedRunningTime="2025-10-06 09:03:08.124301995 +0000 UTC m=+1319.839355393" Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.362483 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.363692 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.430702 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.648815 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.903244 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 09:03:08 crc kubenswrapper[4610]: I1006 09:03:08.903553 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.095431 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.104844 4610 generic.go:334] "Generic (PLEG): container finished" podID="d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" containerID="05f9f58f875be83673aae401d57ca88b0deefa5852be2171880ae839549e78d1" exitCode=0 Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.104929 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m9pmg" 
event={"ID":"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a","Type":"ContainerDied","Data":"05f9f58f875be83673aae401d57ca88b0deefa5852be2171880ae839549e78d1"} Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.174715 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.186154 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.278544 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-k4x5f"] Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.278958 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerName="dnsmasq-dns" containerID="cri-o://fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163" gracePeriod=10 Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.869016 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.920290 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqh5s\" (UniqueName: \"kubernetes.io/projected/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-kube-api-access-lqh5s\") pod \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.920371 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-sb\") pod \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.920425 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-config\") pod \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.920463 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-nb\") pod \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.920583 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-svc\") pod \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.921174 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-swift-storage-0\") pod \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\" (UID: \"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9\") " Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.938210 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-kube-api-access-lqh5s" (OuterVolumeSpecName: "kube-api-access-lqh5s") pod "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" (UID: "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9"). InnerVolumeSpecName "kube-api-access-lqh5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.986263 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 09:03:09 crc kubenswrapper[4610]: I1006 09:03:09.986575 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.023331 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqh5s\" (UniqueName: \"kubernetes.io/projected/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-kube-api-access-lqh5s\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.039811 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" (UID: "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.040685 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" (UID: "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.067638 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" (UID: "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.068264 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" (UID: "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.103361 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-config" (OuterVolumeSpecName: "config") pod "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" (UID: "ccc29f79-1401-4337-87c6-ef7c9d2ec7f9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.118873 4610 generic.go:334] "Generic (PLEG): container finished" podID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerID="fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163" exitCode=0 Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.119222 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.119448 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" event={"ID":"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9","Type":"ContainerDied","Data":"fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163"} Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.119480 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-k4x5f" event={"ID":"ccc29f79-1401-4337-87c6-ef7c9d2ec7f9","Type":"ContainerDied","Data":"25e25665788695f95d2b8fc6c73d6a29d02b99fec812431c8b79d86a3e152154"} Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.119496 4610 scope.go:117] "RemoveContainer" containerID="fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.125514 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.125542 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.125551 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.125560 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.125568 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.159262 4610 scope.go:117] "RemoveContainer" containerID="4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.179908 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-k4x5f"] Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.195376 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-k4x5f"] Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.214337 4610 scope.go:117] "RemoveContainer" containerID="fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163" Oct 06 09:03:10 crc kubenswrapper[4610]: E1006 09:03:10.220195 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163\": container with ID starting with fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163 not found: ID does not exist" containerID="fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.220234 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163"} err="failed to get container status \"fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163\": rpc error: code = NotFound desc = could not find container \"fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163\": container with ID starting with fdd70107d2770a297c5f15d11cd036f7fee484fef2fb18819e1e827e06478163 not found: ID does not exist" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.220262 4610 scope.go:117] "RemoveContainer" containerID="4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc" Oct 06 09:03:10 crc kubenswrapper[4610]: E1006 09:03:10.227928 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc\": container with ID starting with 4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc not found: ID does not exist" containerID="4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.227967 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc"} err="failed to get container status \"4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc\": rpc error: code = NotFound desc = could not find container \"4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc\": container with ID starting with 4e8b765634e4b9624972a8f3d5dd2d48e5bc0f231d617918a00a47ba2a4a88bc not found: ID does not exist" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.564185 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.639822 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-config-data\") pod \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.639921 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqwjt\" (UniqueName: \"kubernetes.io/projected/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-kube-api-access-zqwjt\") pod \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.640060 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-scripts\") pod \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.640093 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-combined-ca-bundle\") pod \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\" (UID: \"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a\") " Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.660708 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-scripts" (OuterVolumeSpecName: "scripts") pod "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" (UID: "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.664177 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-kube-api-access-zqwjt" (OuterVolumeSpecName: "kube-api-access-zqwjt") pod "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" (UID: "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a"). InnerVolumeSpecName "kube-api-access-zqwjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.681176 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" (UID: "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.689301 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-config-data" (OuterVolumeSpecName: "config-data") pod "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" (UID: "d7411bbf-edb9-450b-ae3b-02ccaa0dd04a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.742665 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.742699 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqwjt\" (UniqueName: \"kubernetes.io/projected/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-kube-api-access-zqwjt\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.742712 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:10 crc kubenswrapper[4610]: I1006 09:03:10.742720 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.099033 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" path="/var/lib/kubelet/pods/ccc29f79-1401-4337-87c6-ef7c9d2ec7f9/volumes" Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.147774 4610 generic.go:334] "Generic (PLEG): container finished" podID="f6ed6294-5577-470e-8571-199cc7cc777d" containerID="ca5fdc62c020cdfbae7862d83298cb2627b8031bbecc60a7d4942edf1bab1b9a" exitCode=0 Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.147824 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vflrs" event={"ID":"f6ed6294-5577-470e-8571-199cc7cc777d","Type":"ContainerDied","Data":"ca5fdc62c020cdfbae7862d83298cb2627b8031bbecc60a7d4942edf1bab1b9a"} Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.172395 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m9pmg" event={"ID":"d7411bbf-edb9-450b-ae3b-02ccaa0dd04a","Type":"ContainerDied","Data":"1d281c727e3adc8ade8c99af90e186353ad0e6d16c7229f6d065d0c5f4b6bab9"} Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.172455 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d281c727e3adc8ade8c99af90e186353ad0e6d16c7229f6d065d0c5f4b6bab9" Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.172528 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m9pmg" Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.327386 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.340641 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.340862 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-log" containerID="cri-o://1af331a1cbacd4d93576dcd593c82a0f47aeb5008c181d6d0ec5b2aa350024ce" gracePeriod=30 Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.340966 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-api" containerID="cri-o://2949f39bb9c0ccd2c3f3e629246725f6f64c20c8708e9c50698e0e21b9f4780e" gracePeriod=30 Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.371310 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.371516 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-log" containerID="cri-o://edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836" gracePeriod=30 Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.371645 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-metadata" containerID="cri-o://3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153" gracePeriod=30 Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.466607 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.466691 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 09:03:11 crc kubenswrapper[4610]: I1006 09:03:11.963228 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.064255 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-config-data\") pod \"485c102e-5c6c-46d6-a459-564d894991d4\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.064285 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mjxn\" (UniqueName: \"kubernetes.io/projected/485c102e-5c6c-46d6-a459-564d894991d4-kube-api-access-4mjxn\") pod \"485c102e-5c6c-46d6-a459-564d894991d4\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.064373 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-nova-metadata-tls-certs\") pod \"485c102e-5c6c-46d6-a459-564d894991d4\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.064399 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c102e-5c6c-46d6-a459-564d894991d4-logs\") pod \"485c102e-5c6c-46d6-a459-564d894991d4\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.064415 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-combined-ca-bundle\") pod \"485c102e-5c6c-46d6-a459-564d894991d4\" (UID: \"485c102e-5c6c-46d6-a459-564d894991d4\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.064903 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485c102e-5c6c-46d6-a459-564d894991d4-logs" (OuterVolumeSpecName: "logs") pod "485c102e-5c6c-46d6-a459-564d894991d4" (UID: "485c102e-5c6c-46d6-a459-564d894991d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.075422 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485c102e-5c6c-46d6-a459-564d894991d4-kube-api-access-4mjxn" (OuterVolumeSpecName: "kube-api-access-4mjxn") pod "485c102e-5c6c-46d6-a459-564d894991d4" (UID: "485c102e-5c6c-46d6-a459-564d894991d4"). InnerVolumeSpecName "kube-api-access-4mjxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.098140 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485c102e-5c6c-46d6-a459-564d894991d4" (UID: "485c102e-5c6c-46d6-a459-564d894991d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.111226 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-config-data" (OuterVolumeSpecName: "config-data") pod "485c102e-5c6c-46d6-a459-564d894991d4" (UID: "485c102e-5c6c-46d6-a459-564d894991d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.122129 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "485c102e-5c6c-46d6-a459-564d894991d4" (UID: "485c102e-5c6c-46d6-a459-564d894991d4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.165629 4610 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.165663 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c102e-5c6c-46d6-a459-564d894991d4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.165672 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.165681 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c102e-5c6c-46d6-a459-564d894991d4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.165691 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mjxn\" (UniqueName: \"kubernetes.io/projected/485c102e-5c6c-46d6-a459-564d894991d4-kube-api-access-4mjxn\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193528 4610 generic.go:334] "Generic (PLEG): container finished" podID="485c102e-5c6c-46d6-a459-564d894991d4" containerID="3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153" exitCode=0 Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193565 4610 generic.go:334] "Generic (PLEG): container finished" podID="485c102e-5c6c-46d6-a459-564d894991d4" containerID="edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836" exitCode=143 Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193590 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193585 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"485c102e-5c6c-46d6-a459-564d894991d4","Type":"ContainerDied","Data":"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153"} Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193679 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"485c102e-5c6c-46d6-a459-564d894991d4","Type":"ContainerDied","Data":"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836"} Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193700 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"485c102e-5c6c-46d6-a459-564d894991d4","Type":"ContainerDied","Data":"b4d4b7d734bccf95375be9f82149ee42f25fb17785a79c1039333855c067059b"} Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.193742 4610 scope.go:117] "RemoveContainer" containerID="3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.196547 4610 generic.go:334] "Generic (PLEG): container finished" podID="601baab6-f9e5-4b9f-9938-0f857d936052" containerID="1af331a1cbacd4d93576dcd593c82a0f47aeb5008c181d6d0ec5b2aa350024ce" exitCode=143 Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.196693 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="067412b2-e953-438f-b53e-af51eba5313f" containerName="nova-scheduler-scheduler" containerID="cri-o://f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" gracePeriod=30 Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.197278 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601baab6-f9e5-4b9f-9938-0f857d936052","Type":"ContainerDied","Data":"1af331a1cbacd4d93576dcd593c82a0f47aeb5008c181d6d0ec5b2aa350024ce"} Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.225694 4610 scope.go:117] "RemoveContainer" containerID="edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.235102 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.248474 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.269918 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.270308 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-log" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270326 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-log" Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.270343 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerName="dnsmasq-dns" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270350 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerName="dnsmasq-dns" Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.270362 4610 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-metadata" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270368 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-metadata" Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.270389 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" containerName="nova-manage" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270395 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" containerName="nova-manage" Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.270406 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerName="init" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270412 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerName="init" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270572 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-metadata" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270589 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="485c102e-5c6c-46d6-a459-564d894991d4" containerName="nova-metadata-log" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270613 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc29f79-1401-4337-87c6-ef7c9d2ec7f9" containerName="dnsmasq-dns" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.270622 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" containerName="nova-manage" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.272548 4610 scope.go:117] "RemoveContainer" containerID="3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.274407 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.279396 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153\": container with ID starting with 3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153 not found: ID does not exist" containerID="3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.279445 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153"} err="failed to get container status \"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153\": rpc error: code = NotFound desc = could not find container \"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153\": container with ID starting with 3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153 not found: ID does not exist" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.279471 4610 scope.go:117] "RemoveContainer" containerID="edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.279754 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.280635 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.282234 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:12 crc kubenswrapper[4610]: E1006 09:03:12.284496 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836\": container with ID starting with edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836 not found: ID does not exist" containerID="edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.284526 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836"} err="failed to get container status \"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836\": rpc error: code = NotFound desc = could not find container \"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836\": container with ID starting with edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836 not found: ID does not exist" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.284548 4610 scope.go:117] "RemoveContainer" containerID="3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.285124 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153"} err="failed to get container status \"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153\": rpc error: code = NotFound desc = could not find container \"3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153\": container with ID starting with 
3f126693c584aba67eec901094b745b170a82227325bcb783d66388c0f05d153 not found: ID does not exist" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.285168 4610 scope.go:117] "RemoveContainer" containerID="edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.288094 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836"} err="failed to get container status \"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836\": rpc error: code = NotFound desc = could not find container \"edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836\": container with ID starting with edcad6c05d1b3a9bc7f01492ffbfd34339338e6f9e5894f438289f5930d12836 not found: ID does not exist" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.473377 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-config-data\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.473460 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/637f54e5-73e3-4944-bfcc-076a768c34bc-logs\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.473669 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.473799 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4j28\" (UniqueName: \"kubernetes.io/projected/637f54e5-73e3-4944-bfcc-076a768c34bc-kube-api-access-l4j28\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.473884 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.576168 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.576233 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4j28\" (UniqueName: \"kubernetes.io/projected/637f54e5-73e3-4944-bfcc-076a768c34bc-kube-api-access-l4j28\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 
crc kubenswrapper[4610]: I1006 09:03:12.576268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.576346 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-config-data\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.576374 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/637f54e5-73e3-4944-bfcc-076a768c34bc-logs\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.576738 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/637f54e5-73e3-4944-bfcc-076a768c34bc-logs\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.583015 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.586270 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.587519 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-config-data\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.600711 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4j28\" (UniqueName: \"kubernetes.io/projected/637f54e5-73e3-4944-bfcc-076a768c34bc-kube-api-access-l4j28\") pod \"nova-metadata-0\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " pod="openstack/nova-metadata-0" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.664377 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.779180 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpc8x\" (UniqueName: \"kubernetes.io/projected/f6ed6294-5577-470e-8571-199cc7cc777d-kube-api-access-bpc8x\") pod \"f6ed6294-5577-470e-8571-199cc7cc777d\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.779320 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-config-data\") pod \"f6ed6294-5577-470e-8571-199cc7cc777d\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.779359 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-combined-ca-bundle\") pod \"f6ed6294-5577-470e-8571-199cc7cc777d\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.779481 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-scripts\") pod \"f6ed6294-5577-470e-8571-199cc7cc777d\" (UID: \"f6ed6294-5577-470e-8571-199cc7cc777d\") " Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.782322 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-scripts" (OuterVolumeSpecName: "scripts") pod "f6ed6294-5577-470e-8571-199cc7cc777d" (UID: "f6ed6294-5577-470e-8571-199cc7cc777d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.793214 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ed6294-5577-470e-8571-199cc7cc777d-kube-api-access-bpc8x" (OuterVolumeSpecName: "kube-api-access-bpc8x") pod "f6ed6294-5577-470e-8571-199cc7cc777d" (UID: "f6ed6294-5577-470e-8571-199cc7cc777d"). InnerVolumeSpecName "kube-api-access-bpc8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.811701 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6ed6294-5577-470e-8571-199cc7cc777d" (UID: "f6ed6294-5577-470e-8571-199cc7cc777d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.833753 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-config-data" (OuterVolumeSpecName: "config-data") pod "f6ed6294-5577-470e-8571-199cc7cc777d" (UID: "f6ed6294-5577-470e-8571-199cc7cc777d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.881929 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.881960 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.881970 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed6294-5577-470e-8571-199cc7cc777d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.881980 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpc8x\" (UniqueName: \"kubernetes.io/projected/f6ed6294-5577-470e-8571-199cc7cc777d-kube-api-access-bpc8x\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:12 crc kubenswrapper[4610]: I1006 09:03:12.899714 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.083183 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485c102e-5c6c-46d6-a459-564d894991d4" path="/var/lib/kubelet/pods/485c102e-5c6c-46d6-a459-564d894991d4/volumes" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.209415 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vflrs" event={"ID":"f6ed6294-5577-470e-8571-199cc7cc777d","Type":"ContainerDied","Data":"bbdc466fee3b07b148e0604bd07f995851ce61617dff894ca859ab0b077e49a4"} Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.209449 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbdc466fee3b07b148e0604bd07f995851ce61617dff894ca859ab0b077e49a4" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.209497 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vflrs" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.251828 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 09:03:13 crc kubenswrapper[4610]: E1006 09:03:13.252245 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ed6294-5577-470e-8571-199cc7cc777d" containerName="nova-cell1-conductor-db-sync" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.252261 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ed6294-5577-470e-8571-199cc7cc777d" containerName="nova-cell1-conductor-db-sync" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.252432 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ed6294-5577-470e-8571-199cc7cc777d" containerName="nova-cell1-conductor-db-sync" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.252991 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.257777 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.266825 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.290591 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qc5t\" (UniqueName: \"kubernetes.io/projected/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-kube-api-access-5qc5t\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.290701 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.290747 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.362225 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:13 crc kubenswrapper[4610]: E1006 09:03:13.369860 4610 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 09:03:13 crc kubenswrapper[4610]: E1006 09:03:13.373155 4610 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 09:03:13 crc kubenswrapper[4610]: E1006 09:03:13.378566 4610 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 09:03:13 crc kubenswrapper[4610]: E1006 09:03:13.378632 4610 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="067412b2-e953-438f-b53e-af51eba5313f" containerName="nova-scheduler-scheduler" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.392575 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.392983 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qc5t\" (UniqueName: \"kubernetes.io/projected/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-kube-api-access-5qc5t\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.393124 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.397993 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.398075 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.412164 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qc5t\" (UniqueName: \"kubernetes.io/projected/f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992-kube-api-access-5qc5t\") pod \"nova-cell1-conductor-0\" (UID: \"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992\") " pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:13 crc kubenswrapper[4610]: I1006 09:03:13.568635 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.150189 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.226452 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992","Type":"ContainerStarted","Data":"aaff0f18fca330420c1065d8fb6d733a8370ea8f857a93020bc58fac6eeec8a6"} Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.228655 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"637f54e5-73e3-4944-bfcc-076a768c34bc","Type":"ContainerStarted","Data":"dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1"} Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.228693 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"637f54e5-73e3-4944-bfcc-076a768c34bc","Type":"ContainerStarted","Data":"65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063"} Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.228710 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"637f54e5-73e3-4944-bfcc-076a768c34bc","Type":"ContainerStarted","Data":"9358d8f80c1caa5088e21d7f907fa441b9a20f9c2b09f8e268338a024ff067b8"} Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.261092 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.261068168 podStartE2EDuration="2.261068168s" podCreationTimestamp="2025-10-06 09:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:14.252767519 +0000 UTC m=+1325.967820907" watchObservedRunningTime="2025-10-06 09:03:14.261068168 +0000 UTC m=+1325.976121556" Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.874564 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 09:03:14 crc kubenswrapper[4610]: I1006 09:03:14.875363 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cf575405-4778-47c1-b0c1-b1a51c9936d1" containerName="kube-state-metrics" containerID="cri-o://d4f73b9bbf53dbd41bf1dde3d799a21f990a36bb7fd5b5be1259a74dc8c1f380" gracePeriod=30 Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.242252 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992","Type":"ContainerStarted","Data":"3a94ec9cc85612ee447abc76af0df4456be2682de741fad1ba69253eab7f1eef"} Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.243489 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.254699 4610 generic.go:334] "Generic (PLEG): container finished" podID="cf575405-4778-47c1-b0c1-b1a51c9936d1" containerID="d4f73b9bbf53dbd41bf1dde3d799a21f990a36bb7fd5b5be1259a74dc8c1f380" exitCode=2 Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.256722 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cf575405-4778-47c1-b0c1-b1a51c9936d1","Type":"ContainerDied","Data":"d4f73b9bbf53dbd41bf1dde3d799a21f990a36bb7fd5b5be1259a74dc8c1f380"} Oct 06 
09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.276562 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.276543053 podStartE2EDuration="2.276543053s" podCreationTimestamp="2025-10-06 09:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:15.269959868 +0000 UTC m=+1326.985013256" watchObservedRunningTime="2025-10-06 09:03:15.276543053 +0000 UTC m=+1326.991596441" Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.441708 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.633256 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22smf\" (UniqueName: \"kubernetes.io/projected/cf575405-4778-47c1-b0c1-b1a51c9936d1-kube-api-access-22smf\") pod \"cf575405-4778-47c1-b0c1-b1a51c9936d1\" (UID: \"cf575405-4778-47c1-b0c1-b1a51c9936d1\") " Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.645262 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf575405-4778-47c1-b0c1-b1a51c9936d1-kube-api-access-22smf" (OuterVolumeSpecName: "kube-api-access-22smf") pod "cf575405-4778-47c1-b0c1-b1a51c9936d1" (UID: "cf575405-4778-47c1-b0c1-b1a51c9936d1"). InnerVolumeSpecName "kube-api-access-22smf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:15 crc kubenswrapper[4610]: I1006 09:03:15.735024 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22smf\" (UniqueName: \"kubernetes.io/projected/cf575405-4778-47c1-b0c1-b1a51c9936d1-kube-api-access-22smf\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.155962 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.249852 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84qgk\" (UniqueName: \"kubernetes.io/projected/067412b2-e953-438f-b53e-af51eba5313f-kube-api-access-84qgk\") pod \"067412b2-e953-438f-b53e-af51eba5313f\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.250169 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-config-data\") pod \"067412b2-e953-438f-b53e-af51eba5313f\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.250206 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-combined-ca-bundle\") pod \"067412b2-e953-438f-b53e-af51eba5313f\" (UID: \"067412b2-e953-438f-b53e-af51eba5313f\") " Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.254282 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067412b2-e953-438f-b53e-af51eba5313f-kube-api-access-84qgk" (OuterVolumeSpecName: "kube-api-access-84qgk") pod "067412b2-e953-438f-b53e-af51eba5313f" (UID: "067412b2-e953-438f-b53e-af51eba5313f"). InnerVolumeSpecName "kube-api-access-84qgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.267503 4610 generic.go:334] "Generic (PLEG): container finished" podID="067412b2-e953-438f-b53e-af51eba5313f" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" exitCode=0 Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.267647 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"067412b2-e953-438f-b53e-af51eba5313f","Type":"ContainerDied","Data":"f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e"} Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.267672 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"067412b2-e953-438f-b53e-af51eba5313f","Type":"ContainerDied","Data":"3efd7f5ef283a674bb45e56177635b749fef3402c5bf752309318344b61cdf00"} Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.267688 4610 scope.go:117] "RemoveContainer" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.267786 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.274565 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.275108 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cf575405-4778-47c1-b0c1-b1a51c9936d1","Type":"ContainerDied","Data":"873b27a304dab5994e9a265e53cb068755436e385657fc7519caa409daf8186c"} Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.282232 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-config-data" (OuterVolumeSpecName: "config-data") pod "067412b2-e953-438f-b53e-af51eba5313f" (UID: "067412b2-e953-438f-b53e-af51eba5313f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.298320 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "067412b2-e953-438f-b53e-af51eba5313f" (UID: "067412b2-e953-438f-b53e-af51eba5313f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.307614 4610 scope.go:117] "RemoveContainer" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" Oct 06 09:03:16 crc kubenswrapper[4610]: E1006 09:03:16.312776 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e\": container with ID starting with f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e not found: ID does not exist" containerID="f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.312827 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e"} err="failed to get container status \"f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e\": rpc error: code = NotFound desc = could not find container \"f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e\": container with ID starting with f830533f110441fbfda74a5b0f02e5c6bcd3e09a5ceb02bdb98f91bb8173954e not found: ID does not exist" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.312849 4610 scope.go:117] "RemoveContainer" containerID="d4f73b9bbf53dbd41bf1dde3d799a21f990a36bb7fd5b5be1259a74dc8c1f380" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.317939 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.330180 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.335020 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: E1006 09:03:16.339543 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067412b2-e953-438f-b53e-af51eba5313f" containerName="nova-scheduler-scheduler" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.339566 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="067412b2-e953-438f-b53e-af51eba5313f" containerName="nova-scheduler-scheduler" Oct 06 09:03:16 crc kubenswrapper[4610]: E1006 09:03:16.339595 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf575405-4778-47c1-b0c1-b1a51c9936d1" containerName="kube-state-metrics" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.339602 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf575405-4778-47c1-b0c1-b1a51c9936d1" containerName="kube-state-metrics" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.339758 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf575405-4778-47c1-b0c1-b1a51c9936d1" containerName="kube-state-metrics" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.339778 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="067412b2-e953-438f-b53e-af51eba5313f" containerName="nova-scheduler-scheduler" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.352707 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.354670 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.365484 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.365697 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.403637 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84qgk\" (UniqueName: \"kubernetes.io/projected/067412b2-e953-438f-b53e-af51eba5313f-kube-api-access-84qgk\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.405107 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.405160 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067412b2-e953-438f-b53e-af51eba5313f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.468777 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.469022 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.506806 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.506874 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbztd\" (UniqueName: \"kubernetes.io/projected/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-api-access-jbztd\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.506904 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.506927 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.607319 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.608893 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbztd\" (UniqueName: \"kubernetes.io/projected/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-api-access-jbztd\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.609120 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.609265 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.609730 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.616987 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.621388 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.623093 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.626692 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.633771 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.635033 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.637063 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbztd\" (UniqueName: \"kubernetes.io/projected/cdd44fea-d46e-45e1-be47-89cc8a1f63c7-kube-api-access-jbztd\") pod \"kube-state-metrics-0\" (UID: \"cdd44fea-d46e-45e1-be47-89cc8a1f63c7\") " pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.637080 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.641055 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.688575 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.814447 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vf6q\" (UniqueName: \"kubernetes.io/projected/794a334b-382f-4fbc-b61f-11e5f9afb1c4-kube-api-access-6vf6q\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.814550 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.814596 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-config-data\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.919172 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.919229 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-config-data\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.919330 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vf6q\" (UniqueName: \"kubernetes.io/projected/794a334b-382f-4fbc-b61f-11e5f9afb1c4-kube-api-access-6vf6q\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.934586 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-config-data\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 
06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.952607 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.954366 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.954650 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-central-agent" containerID="cri-o://e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" gracePeriod=30 Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.955776 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="proxy-httpd" containerID="cri-o://c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" gracePeriod=30 Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.955873 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="sg-core" containerID="cri-o://7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" gracePeriod=30 Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.957269 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-notification-agent" containerID="cri-o://6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" gracePeriod=30 Oct 06 09:03:16 crc kubenswrapper[4610]: I1006 09:03:16.964803 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vf6q\" (UniqueName: \"kubernetes.io/projected/794a334b-382f-4fbc-b61f-11e5f9afb1c4-kube-api-access-6vf6q\") pod \"nova-scheduler-0\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.004484 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.085180 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067412b2-e953-438f-b53e-af51eba5313f" path="/var/lib/kubelet/pods/067412b2-e953-438f-b53e-af51eba5313f/volumes" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.086257 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf575405-4778-47c1-b0c1-b1a51c9936d1" path="/var/lib/kubelet/pods/cf575405-4778-47c1-b0c1-b1a51c9936d1/volumes" Oct 06 09:03:17 crc kubenswrapper[4610]: W1006 09:03:17.294793 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdd44fea_d46e_45e1_be47_89cc8a1f63c7.slice/crio-7b5e84a9a6ecacb0fc82f698ac1d33338e0a798b5c48a4f39037b62dd9bdff19 WatchSource:0}: Error finding container 7b5e84a9a6ecacb0fc82f698ac1d33338e0a798b5c48a4f39037b62dd9bdff19: Status 404 returned error can't find the container with id 7b5e84a9a6ecacb0fc82f698ac1d33338e0a798b5c48a4f39037b62dd9bdff19 Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.296622 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.298212 4610 generic.go:334] "Generic (PLEG): container finished" podID="13761fa8-a633-4047-a034-12077c20d9f0" containerID="7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" exitCode=2 Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.298267 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerDied","Data":"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1"} Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.304324 4610 generic.go:334] "Generic (PLEG): container finished" podID="601baab6-f9e5-4b9f-9938-0f857d936052" containerID="2949f39bb9c0ccd2c3f3e629246725f6f64c20c8708e9c50698e0e21b9f4780e" exitCode=0 Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.304375 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601baab6-f9e5-4b9f-9938-0f857d936052","Type":"ContainerDied","Data":"2949f39bb9c0ccd2c3f3e629246725f6f64c20c8708e9c50698e0e21b9f4780e"} Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.320675 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.429968 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-combined-ca-bundle\") pod \"601baab6-f9e5-4b9f-9938-0f857d936052\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.430072 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601baab6-f9e5-4b9f-9938-0f857d936052-logs\") pod \"601baab6-f9e5-4b9f-9938-0f857d936052\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.430195 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svkrw\" (UniqueName: \"kubernetes.io/projected/601baab6-f9e5-4b9f-9938-0f857d936052-kube-api-access-svkrw\") pod \"601baab6-f9e5-4b9f-9938-0f857d936052\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.430530 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/601baab6-f9e5-4b9f-9938-0f857d936052-logs" (OuterVolumeSpecName: "logs") pod "601baab6-f9e5-4b9f-9938-0f857d936052" (UID: "601baab6-f9e5-4b9f-9938-0f857d936052"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.431279 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-config-data\") pod \"601baab6-f9e5-4b9f-9938-0f857d936052\" (UID: \"601baab6-f9e5-4b9f-9938-0f857d936052\") " Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.432037 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601baab6-f9e5-4b9f-9938-0f857d936052-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.434546 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601baab6-f9e5-4b9f-9938-0f857d936052-kube-api-access-svkrw" (OuterVolumeSpecName: "kube-api-access-svkrw") pod "601baab6-f9e5-4b9f-9938-0f857d936052" (UID: "601baab6-f9e5-4b9f-9938-0f857d936052"). InnerVolumeSpecName "kube-api-access-svkrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.457553 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-config-data" (OuterVolumeSpecName: "config-data") pod "601baab6-f9e5-4b9f-9938-0f857d936052" (UID: "601baab6-f9e5-4b9f-9938-0f857d936052"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.458856 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "601baab6-f9e5-4b9f-9938-0f857d936052" (UID: "601baab6-f9e5-4b9f-9938-0f857d936052"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.518602 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.533000 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svkrw\" (UniqueName: \"kubernetes.io/projected/601baab6-f9e5-4b9f-9938-0f857d936052-kube-api-access-svkrw\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.533027 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.533036 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601baab6-f9e5-4b9f-9938-0f857d936052-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.900521 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.900843 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 09:03:17 crc kubenswrapper[4610]: I1006 09:03:17.944425 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.143426 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-log-httpd\") pod \"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.143745 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-scripts\") pod \"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144329 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-combined-ca-bundle\") pod \"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144394 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-config-data\") pod \"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144413 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-sg-core-conf-yaml\") pod \"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144471 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhxz\" (UniqueName: \"kubernetes.io/projected/13761fa8-a633-4047-a034-12077c20d9f0-kube-api-access-zqhxz\") pod 
\"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144517 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144529 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-run-httpd\") pod \"13761fa8-a633-4047-a034-12077c20d9f0\" (UID: \"13761fa8-a633-4047-a034-12077c20d9f0\") " Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.144766 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.145253 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.145279 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13761fa8-a633-4047-a034-12077c20d9f0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.149134 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-scripts" (OuterVolumeSpecName: "scripts") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.154464 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13761fa8-a633-4047-a034-12077c20d9f0-kube-api-access-zqhxz" (OuterVolumeSpecName: "kube-api-access-zqhxz") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "kube-api-access-zqhxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.208608 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.229555 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.247699 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.247730 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.247741 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.247749 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhxz\" (UniqueName: \"kubernetes.io/projected/13761fa8-a633-4047-a034-12077c20d9f0-kube-api-access-zqhxz\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.263955 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-config-data" (OuterVolumeSpecName: "config-data") pod "13761fa8-a633-4047-a034-12077c20d9f0" (UID: "13761fa8-a633-4047-a034-12077c20d9f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.314836 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601baab6-f9e5-4b9f-9938-0f857d936052","Type":"ContainerDied","Data":"acd81ebc85619ec99681e3d01e40e773b549a23c47bb75bb2c3caea1bbfe5ad5"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.314885 4610 scope.go:117] "RemoveContainer" containerID="2949f39bb9c0ccd2c3f3e629246725f6f64c20c8708e9c50698e0e21b9f4780e" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.314977 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322487 4610 generic.go:334] "Generic (PLEG): container finished" podID="13761fa8-a633-4047-a034-12077c20d9f0" containerID="c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" exitCode=0 Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322520 4610 generic.go:334] "Generic (PLEG): container finished" podID="13761fa8-a633-4047-a034-12077c20d9f0" containerID="6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" exitCode=0 Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322531 4610 generic.go:334] "Generic (PLEG): container finished" podID="13761fa8-a633-4047-a034-12077c20d9f0" containerID="e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" exitCode=0 Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322584 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerDied","Data":"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322617 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerDied","Data":"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322632 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerDied","Data":"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322644 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13761fa8-a633-4047-a034-12077c20d9f0","Type":"ContainerDied","Data":"19e5534c72d577602a57a72b1d52f0a0b0f420442e8dd6272ac5e4b087b974e2"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.322667 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.331302 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794a334b-382f-4fbc-b61f-11e5f9afb1c4","Type":"ContainerStarted","Data":"4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.331349 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794a334b-382f-4fbc-b61f-11e5f9afb1c4","Type":"ContainerStarted","Data":"8925b2488dd015306323299cde0350295fb34facd65dcd0154e1cdad9afb0f93"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.334166 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cdd44fea-d46e-45e1-be47-89cc8a1f63c7","Type":"ContainerStarted","Data":"ccef5d79bd866aac8a6d61d108ac81c256735e51c1eae750b0f86f75526d96a5"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.334198 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cdd44fea-d46e-45e1-be47-89cc8a1f63c7","Type":"ContainerStarted","Data":"7b5e84a9a6ecacb0fc82f698ac1d33338e0a798b5c48a4f39037b62dd9bdff19"} Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.334940 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.353090 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13761fa8-a633-4047-a034-12077c20d9f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.356161 4610 scope.go:117] "RemoveContainer" containerID="1af331a1cbacd4d93576dcd593c82a0f47aeb5008c181d6d0ec5b2aa350024ce" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.361402 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.372679 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.377812 4610 scope.go:117] "RemoveContainer" containerID="c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.404597 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.404961 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="proxy-httpd" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.404973 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="proxy-httpd" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.404983 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-central-agent" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.404989 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-central-agent" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.405024 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-log" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 
09:03:18.405031 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-log" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.405062 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-notification-agent" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405070 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-notification-agent" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.405081 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="sg-core" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405088 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="sg-core" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.405111 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-api" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405118 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-api" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405273 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="sg-core" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405293 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="proxy-httpd" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405306 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-central-agent" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405317 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-log" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405331 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="13761fa8-a633-4047-a034-12077c20d9f0" containerName="ceilometer-notification-agent" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.405342 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" containerName="nova-api-api" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.408662 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.415978 4610 scope.go:117] "RemoveContainer" containerID="7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.416323 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.416788 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.416768875 podStartE2EDuration="2.416768875s" podCreationTimestamp="2025-10-06 09:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:18.403982494 +0000 UTC m=+1330.119035882" watchObservedRunningTime="2025-10-06 09:03:18.416768875 +0000 UTC m=+1330.131822273" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.475065 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqpl\" (UniqueName: \"kubernetes.io/projected/c54e0d9b-8e85-4be5-a5f2-b709028c085f-kube-api-access-prqpl\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.475384 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.475521 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.475663 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54e0d9b-8e85-4be5-a5f2-b709028c085f-logs\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.478587 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.482442 4610 scope.go:117] "RemoveContainer" containerID="6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.501364 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.523720 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.527807 4610 scope.go:117] "RemoveContainer" containerID="e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.539796 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.545399 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.546035 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.12442313 podStartE2EDuration="2.546015472s" podCreationTimestamp="2025-10-06 09:03:16 +0000 UTC" firstStartedPulling="2025-10-06 09:03:17.297698948 +0000 UTC m=+1329.012752336" lastFinishedPulling="2025-10-06 09:03:17.71929129 +0000 UTC m=+1329.434344678" observedRunningTime="2025-10-06 09:03:18.457505709 +0000 UTC m=+1330.172559097" watchObservedRunningTime="2025-10-06 09:03:18.546015472 +0000 UTC m=+1330.261068860" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.551978 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.552333 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.556512 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.581351 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54e0d9b-8e85-4be5-a5f2-b709028c085f-logs\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.583365 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54e0d9b-8e85-4be5-a5f2-b709028c085f-logs\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.586722 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqpl\" (UniqueName: \"kubernetes.io/projected/c54e0d9b-8e85-4be5-a5f2-b709028c085f-kube-api-access-prqpl\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.586960 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.587107 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.595211 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.608692 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data\") pod \"nova-api-0\" (UID: 
\"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.613439 4610 scope.go:117] "RemoveContainer" containerID="c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.618514 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": container with ID starting with c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c not found: ID does not exist" containerID="c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.618567 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c"} err="failed to get container status \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": rpc error: code = NotFound desc = could not find container \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": container with ID starting with c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.618612 4610 scope.go:117] "RemoveContainer" containerID="7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.621489 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": container with ID starting with 7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1 not found: ID does not exist" containerID="7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.621547 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1"} err="failed to get container status \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": rpc error: code = NotFound desc = could not find container \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": container with ID starting with 7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.621573 4610 scope.go:117] "RemoveContainer" containerID="6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.622886 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqpl\" (UniqueName: \"kubernetes.io/projected/c54e0d9b-8e85-4be5-a5f2-b709028c085f-kube-api-access-prqpl\") pod \"nova-api-0\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.623430 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": container with ID starting with 6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266 not found: ID does not exist" containerID="6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" Oct 06 
09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.623468 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266"} err="failed to get container status \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": rpc error: code = NotFound desc = could not find container \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": container with ID starting with 6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.623483 4610 scope.go:117] "RemoveContainer" containerID="e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" Oct 06 09:03:18 crc kubenswrapper[4610]: E1006 09:03:18.624125 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": container with ID starting with e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926 not found: ID does not exist" containerID="e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.624151 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926"} err="failed to get container status \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": rpc error: code = NotFound desc = could not find container \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": container with ID starting with e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.624177 4610 scope.go:117] "RemoveContainer" containerID="c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.624924 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c"} err="failed to get container status \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": rpc error: code = NotFound desc = could not find container \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": container with ID starting with c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.624943 4610 scope.go:117] "RemoveContainer" containerID="7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625173 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1"} err="failed to get container status \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": rpc error: code = NotFound desc = could not find container \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": container with ID starting with 7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625194 4610 scope.go:117] "RemoveContainer" containerID="6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" Oct 06 
09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625395 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266"} err="failed to get container status \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": rpc error: code = NotFound desc = could not find container \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": container with ID starting with 6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625415 4610 scope.go:117] "RemoveContainer" containerID="e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625578 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926"} err="failed to get container status \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": rpc error: code = NotFound desc = could not find container \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": container with ID starting with e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625592 4610 scope.go:117] "RemoveContainer" containerID="c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625749 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c"} err="failed to get container status \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": rpc error: code = NotFound desc = could not find container \"c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c\": container with ID starting with c1a67a0f7ad680a65e922e4459bf021cc248f634fd4321783eff940679eacc3c not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625765 4610 scope.go:117] "RemoveContainer" containerID="7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625915 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1"} err="failed to get container status \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": rpc error: code = NotFound desc = could not find container \"7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1\": container with ID starting with 7fac6c3d4edbf008e0c49a3d54bd89532b4881ff4881d9d831fd570a650284d1 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.625933 4610 scope.go:117] "RemoveContainer" containerID="6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.626165 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266"} err="failed to get container status \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": rpc error: code = NotFound desc = could not find container \"6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266\": 
container with ID starting with 6350c7590c7c8c5fa9873b02b2e4f631d93219e951c99d045ba0df1af6bf8266 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.626186 4610 scope.go:117] "RemoveContainer" containerID="e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.626349 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926"} err="failed to get container status \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": rpc error: code = NotFound desc = could not find container \"e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926\": container with ID starting with e02cbb000116804913ac6745d19e8801cedcd08ff7df23355fd4fffd0016a926 not found: ID does not exist" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.640152 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689335 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-log-httpd\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689424 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6tj\" (UniqueName: \"kubernetes.io/projected/37d58e6c-0e5d-443a-af0a-595461252d4f-kube-api-access-fg6tj\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689469 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689533 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-run-httpd\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689555 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-scripts\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689586 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689605 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-config-data\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.689680 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.742315 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791292 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-run-httpd\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791349 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-scripts\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791393 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791418 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-config-data\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791503 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791543 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-log-httpd\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791610 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6tj\" (UniqueName: \"kubernetes.io/projected/37d58e6c-0e5d-443a-af0a-595461252d4f-kube-api-access-fg6tj\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.791659 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " 
pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.796158 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.797259 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-log-httpd\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.797618 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-run-httpd\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.806564 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-scripts\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.812480 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-config-data\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.816591 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.816669 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.821923 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6tj\" (UniqueName: \"kubernetes.io/projected/37d58e6c-0e5d-443a-af0a-595461252d4f-kube-api-access-fg6tj\") pod \"ceilometer-0\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " pod="openstack/ceilometer-0" Oct 06 09:03:18 crc kubenswrapper[4610]: I1006 09:03:18.912479 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:19 crc kubenswrapper[4610]: I1006 09:03:19.082823 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13761fa8-a633-4047-a034-12077c20d9f0" path="/var/lib/kubelet/pods/13761fa8-a633-4047-a034-12077c20d9f0/volumes" Oct 06 09:03:19 crc kubenswrapper[4610]: I1006 09:03:19.084120 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601baab6-f9e5-4b9f-9938-0f857d936052" path="/var/lib/kubelet/pods/601baab6-f9e5-4b9f-9938-0f857d936052/volumes" Oct 06 09:03:19 crc kubenswrapper[4610]: W1006 09:03:19.252714 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54e0d9b_8e85_4be5_a5f2_b709028c085f.slice/crio-dd0b8c14ebe25ec2f3d19fb08c071c436194e748456bcbbc1f1259d4c4f75731 WatchSource:0}: Error finding container dd0b8c14ebe25ec2f3d19fb08c071c436194e748456bcbbc1f1259d4c4f75731: Status 404 returned error can't find the container with id dd0b8c14ebe25ec2f3d19fb08c071c436194e748456bcbbc1f1259d4c4f75731 Oct 06 09:03:19 crc kubenswrapper[4610]: I1006 09:03:19.253256 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:19 crc kubenswrapper[4610]: I1006 09:03:19.345303 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c54e0d9b-8e85-4be5-a5f2-b709028c085f","Type":"ContainerStarted","Data":"dd0b8c14ebe25ec2f3d19fb08c071c436194e748456bcbbc1f1259d4c4f75731"} Oct 06 09:03:19 crc kubenswrapper[4610]: I1006 09:03:19.409727 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:19 crc kubenswrapper[4610]: W1006 09:03:19.417681 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37d58e6c_0e5d_443a_af0a_595461252d4f.slice/crio-5eb759b364d8463b1001a113bad1ea2d0fa704e9268b9cbc3580a0d8a5e658cb WatchSource:0}: Error finding container 5eb759b364d8463b1001a113bad1ea2d0fa704e9268b9cbc3580a0d8a5e658cb: Status 404 returned error can't find the container with id 5eb759b364d8463b1001a113bad1ea2d0fa704e9268b9cbc3580a0d8a5e658cb Oct 06 09:03:20 crc kubenswrapper[4610]: I1006 09:03:20.363996 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c54e0d9b-8e85-4be5-a5f2-b709028c085f","Type":"ContainerStarted","Data":"d8d815815563192b338bf7a52350e5c32a830098c8c33ed51dce5da3ec1ce104"} Oct 06 09:03:20 crc kubenswrapper[4610]: I1006 09:03:20.364561 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c54e0d9b-8e85-4be5-a5f2-b709028c085f","Type":"ContainerStarted","Data":"2708de13c63f34b11f36d37cc0f621ef5cea534d9fe2a7e313116cff41296acf"} Oct 06 09:03:20 crc kubenswrapper[4610]: I1006 09:03:20.366988 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerStarted","Data":"275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40"} Oct 06 09:03:20 crc kubenswrapper[4610]: I1006 09:03:20.367024 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerStarted","Data":"5eb759b364d8463b1001a113bad1ea2d0fa704e9268b9cbc3580a0d8a5e658cb"} Oct 06 09:03:21 crc kubenswrapper[4610]: I1006 09:03:21.379423 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerStarted","Data":"74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab"} Oct 06 09:03:22 crc kubenswrapper[4610]: I1006 09:03:22.005679 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 09:03:22 crc kubenswrapper[4610]: I1006 09:03:22.397830 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerStarted","Data":"7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7"} Oct 06 09:03:22 crc kubenswrapper[4610]: I1006 09:03:22.901462 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 09:03:22 crc kubenswrapper[4610]: I1006 09:03:22.901516 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.410547 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerStarted","Data":"9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4"} Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.410846 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.430172 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.430156473 podStartE2EDuration="5.430156473s" podCreationTimestamp="2025-10-06 09:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:20.386437456 +0000 UTC m=+1332.101491054" watchObservedRunningTime="2025-10-06 09:03:23.430156473 +0000 UTC m=+1335.145209861" Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.436648 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.961667803 podStartE2EDuration="5.436639406s" podCreationTimestamp="2025-10-06 09:03:18 +0000 UTC" firstStartedPulling="2025-10-06 09:03:19.420495175 +0000 UTC m=+1331.135548563" lastFinishedPulling="2025-10-06 09:03:22.895466768 +0000 UTC m=+1334.610520166" observedRunningTime="2025-10-06 09:03:23.428672286 +0000 UTC m=+1335.143725674" watchObservedRunningTime="2025-10-06 09:03:23.436639406 +0000 UTC m=+1335.151692794" Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.603887 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.917173 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 09:03:23 crc kubenswrapper[4610]: I1006 09:03:23.917180 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 09:03:26 
crc kubenswrapper[4610]: I1006 09:03:26.702076 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 09:03:27 crc kubenswrapper[4610]: I1006 09:03:27.005557 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 09:03:27 crc kubenswrapper[4610]: I1006 09:03:27.042894 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 09:03:27 crc kubenswrapper[4610]: I1006 09:03:27.520195 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 09:03:28 crc kubenswrapper[4610]: I1006 09:03:28.742796 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 09:03:28 crc kubenswrapper[4610]: I1006 09:03:28.742854 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 09:03:29 crc kubenswrapper[4610]: I1006 09:03:29.825196 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 09:03:29 crc kubenswrapper[4610]: I1006 09:03:29.825270 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 09:03:32 crc kubenswrapper[4610]: I1006 09:03:32.906131 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 09:03:32 crc kubenswrapper[4610]: I1006 09:03:32.908342 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 09:03:32 crc kubenswrapper[4610]: I1006 09:03:32.909772 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 09:03:33 crc kubenswrapper[4610]: I1006 09:03:33.544475 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.400426 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.511099 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-combined-ca-bundle\") pod \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.511795 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptvn4\" (UniqueName: \"kubernetes.io/projected/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-kube-api-access-ptvn4\") pod \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.512242 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-config-data\") pod \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\" (UID: \"758e6b3d-66a1-488f-a861-5ee5ab3a7b56\") " Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.517835 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-kube-api-access-ptvn4" (OuterVolumeSpecName: "kube-api-access-ptvn4") pod "758e6b3d-66a1-488f-a861-5ee5ab3a7b56" (UID: "758e6b3d-66a1-488f-a861-5ee5ab3a7b56"). InnerVolumeSpecName "kube-api-access-ptvn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.544844 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-config-data" (OuterVolumeSpecName: "config-data") pod "758e6b3d-66a1-488f-a861-5ee5ab3a7b56" (UID: "758e6b3d-66a1-488f-a861-5ee5ab3a7b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.547323 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "758e6b3d-66a1-488f-a861-5ee5ab3a7b56" (UID: "758e6b3d-66a1-488f-a861-5ee5ab3a7b56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.553369 4610 generic.go:334] "Generic (PLEG): container finished" podID="758e6b3d-66a1-488f-a861-5ee5ab3a7b56" containerID="2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3" exitCode=137 Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.553579 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"758e6b3d-66a1-488f-a861-5ee5ab3a7b56","Type":"ContainerDied","Data":"2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3"} Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.553743 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"758e6b3d-66a1-488f-a861-5ee5ab3a7b56","Type":"ContainerDied","Data":"f07673e0d62167090a2ea0726b12839f440a2063e72afce24eadeb372bc6e8b6"} Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.553821 4610 scope.go:117] "RemoveContainer" containerID="2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.554893 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.615400 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptvn4\" (UniqueName: \"kubernetes.io/projected/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-kube-api-access-ptvn4\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.615683 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.615807 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e6b3d-66a1-488f-a861-5ee5ab3a7b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.624118 4610 scope.go:117] "RemoveContainer" containerID="2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3" Oct 06 09:03:34 crc kubenswrapper[4610]: E1006 09:03:34.624809 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3\": container with ID starting with 2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3 not found: ID does not exist" containerID="2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.624850 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3"} err="failed to get container status \"2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3\": rpc error: code = NotFound desc = could not find container \"2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3\": container with ID starting with 2d1da1862a9486ffca17129ac5e1ccfba1a24d2cba8dbad07b1794c935de19c3 not found: ID does not exist" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.646912 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 
09:03:34.676102 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.684978 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:03:34 crc kubenswrapper[4610]: E1006 09:03:34.685386 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758e6b3d-66a1-488f-a861-5ee5ab3a7b56" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.685402 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="758e6b3d-66a1-488f-a861-5ee5ab3a7b56" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.685591 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="758e6b3d-66a1-488f-a861-5ee5ab3a7b56" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.686268 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.705069 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.706811 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.706931 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.708985 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.820897 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.820976 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbqgp\" (UniqueName: \"kubernetes.io/projected/760824cd-931b-4588-85d0-8b0548fc8c38-kube-api-access-vbqgp\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.820998 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.821038 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.821097 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.922406 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.922828 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbqgp\" (UniqueName: \"kubernetes.io/projected/760824cd-931b-4588-85d0-8b0548fc8c38-kube-api-access-vbqgp\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.922940 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.923109 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.923268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.926432 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.926623 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.927019 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.927426 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/760824cd-931b-4588-85d0-8b0548fc8c38-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:34 crc kubenswrapper[4610]: I1006 09:03:34.939797 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbqgp\" (UniqueName: \"kubernetes.io/projected/760824cd-931b-4588-85d0-8b0548fc8c38-kube-api-access-vbqgp\") pod \"nova-cell1-novncproxy-0\" (UID: \"760824cd-931b-4588-85d0-8b0548fc8c38\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:35 crc kubenswrapper[4610]: I1006 09:03:35.020708 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:35 crc kubenswrapper[4610]: I1006 09:03:35.090651 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758e6b3d-66a1-488f-a861-5ee5ab3a7b56" path="/var/lib/kubelet/pods/758e6b3d-66a1-488f-a861-5ee5ab3a7b56/volumes" Oct 06 09:03:35 crc kubenswrapper[4610]: I1006 09:03:35.591311 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 09:03:35 crc kubenswrapper[4610]: W1006 09:03:35.601118 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod760824cd_931b_4588_85d0_8b0548fc8c38.slice/crio-b9f09dc545c67ede3c05013eb0f388a3dd84bbce739f2c0827eaff474b7966b6 WatchSource:0}: Error finding container b9f09dc545c67ede3c05013eb0f388a3dd84bbce739f2c0827eaff474b7966b6: Status 404 returned error can't find the container with id b9f09dc545c67ede3c05013eb0f388a3dd84bbce739f2c0827eaff474b7966b6 Oct 06 09:03:36 crc kubenswrapper[4610]: I1006 09:03:36.574182 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"760824cd-931b-4588-85d0-8b0548fc8c38","Type":"ContainerStarted","Data":"72faa37b11b54df04a43e7b73bed8b6f505ac5a986bc00f317c5310e835ed7b3"} Oct 06 09:03:36 crc kubenswrapper[4610]: I1006 09:03:36.574701 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"760824cd-931b-4588-85d0-8b0548fc8c38","Type":"ContainerStarted","Data":"b9f09dc545c67ede3c05013eb0f388a3dd84bbce739f2c0827eaff474b7966b6"} Oct 06 09:03:36 crc kubenswrapper[4610]: I1006 09:03:36.590612 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.590588955 podStartE2EDuration="2.590588955s" podCreationTimestamp="2025-10-06 09:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:36.587375985 +0000 UTC m=+1348.302429413" watchObservedRunningTime="2025-10-06 09:03:36.590588955 +0000 UTC m=+1348.305642343" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.747165 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.748383 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.749157 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.749261 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 
06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.752988 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.754088 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.972145 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-nm5v7"] Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.973686 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:38 crc kubenswrapper[4610]: I1006 09:03:38.994645 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-nm5v7"] Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.106281 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6fw\" (UniqueName: \"kubernetes.io/projected/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-kube-api-access-rf6fw\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.106379 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.106401 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.106628 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.106691 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-config\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.106710 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.208375 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6fw\" (UniqueName: \"kubernetes.io/projected/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-kube-api-access-rf6fw\") pod 
\"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.208508 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.208528 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.208572 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.208602 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-config\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.208637 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.209867 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.210011 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.210167 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.210251 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " 
pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.210609 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-config\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.233137 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6fw\" (UniqueName: \"kubernetes.io/projected/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-kube-api-access-rf6fw\") pod \"dnsmasq-dns-5c7b6c5df9-nm5v7\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.293062 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:39 crc kubenswrapper[4610]: I1006 09:03:39.766276 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-nm5v7"] Oct 06 09:03:40 crc kubenswrapper[4610]: I1006 09:03:40.021367 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:40 crc kubenswrapper[4610]: I1006 09:03:40.611903 4610 generic.go:334] "Generic (PLEG): container finished" podID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerID="e32daa257ea131795761e0658bb7dd9455ca5381cd00110a5200f7b175d9528f" exitCode=0 Oct 06 09:03:40 crc kubenswrapper[4610]: I1006 09:03:40.612477 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" event={"ID":"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94","Type":"ContainerDied","Data":"e32daa257ea131795761e0658bb7dd9455ca5381cd00110a5200f7b175d9528f"} Oct 06 09:03:40 crc kubenswrapper[4610]: I1006 09:03:40.612504 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" event={"ID":"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94","Type":"ContainerStarted","Data":"b0b761c7b51302102a18ced5c7900074217e38783e89ad0010ba5a726f1efcca"} Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.317153 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.319183 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-central-agent" containerID="cri-o://275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40" gracePeriod=30 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.319891 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="sg-core" containerID="cri-o://7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7" gracePeriod=30 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.319900 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="proxy-httpd" containerID="cri-o://9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4" gracePeriod=30 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.320797 4610 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-notification-agent" containerID="cri-o://74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab" gracePeriod=30 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.343309 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.201:3000/\": EOF" Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.634102 4610 generic.go:334] "Generic (PLEG): container finished" podID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerID="9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4" exitCode=0 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.634628 4610 generic.go:334] "Generic (PLEG): container finished" podID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerID="7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7" exitCode=2 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.634151 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerDied","Data":"9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4"} Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.634769 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerDied","Data":"7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7"} Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.639113 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" event={"ID":"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94","Type":"ContainerStarted","Data":"5dfa7208bf947a680ecb804f43446466bbb58ce3891c43cb41255957abb288a0"} Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.639296 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.666179 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.666753 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-api" containerID="cri-o://d8d815815563192b338bf7a52350e5c32a830098c8c33ed51dce5da3ec1ce104" gracePeriod=30 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.666756 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-log" containerID="cri-o://2708de13c63f34b11f36d37cc0f621ef5cea534d9fe2a7e313116cff41296acf" gracePeriod=30 Oct 06 09:03:41 crc kubenswrapper[4610]: I1006 09:03:41.668941 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" podStartSLOduration=3.668922995 podStartE2EDuration="3.668922995s" podCreationTimestamp="2025-10-06 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:41.656018511 +0000 UTC m=+1353.371071919" watchObservedRunningTime="2025-10-06 09:03:41.668922995 +0000 UTC m=+1353.383976413" Oct 06 09:03:42 crc kubenswrapper[4610]: 
I1006 09:03:42.651784 4610 generic.go:334] "Generic (PLEG): container finished" podID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerID="275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40" exitCode=0 Oct 06 09:03:42 crc kubenswrapper[4610]: I1006 09:03:42.651860 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerDied","Data":"275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40"} Oct 06 09:03:42 crc kubenswrapper[4610]: I1006 09:03:42.654297 4610 generic.go:334] "Generic (PLEG): container finished" podID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerID="2708de13c63f34b11f36d37cc0f621ef5cea534d9fe2a7e313116cff41296acf" exitCode=143 Oct 06 09:03:42 crc kubenswrapper[4610]: I1006 09:03:42.654439 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c54e0d9b-8e85-4be5-a5f2-b709028c085f","Type":"ContainerDied","Data":"2708de13c63f34b11f36d37cc0f621ef5cea534d9fe2a7e313116cff41296acf"} Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.015794 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093079 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-scripts\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093182 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-config-data\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093239 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-log-httpd\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093357 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-run-httpd\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093398 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6tj\" (UniqueName: \"kubernetes.io/projected/37d58e6c-0e5d-443a-af0a-595461252d4f-kube-api-access-fg6tj\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093433 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-sg-core-conf-yaml\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093501 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-combined-ca-bundle\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.093517 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-ceilometer-tls-certs\") pod \"37d58e6c-0e5d-443a-af0a-595461252d4f\" (UID: \"37d58e6c-0e5d-443a-af0a-595461252d4f\") " Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.094031 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.095768 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.100077 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-scripts" (OuterVolumeSpecName: "scripts") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.107425 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d58e6c-0e5d-443a-af0a-595461252d4f-kube-api-access-fg6tj" (OuterVolumeSpecName: "kube-api-access-fg6tj") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "kube-api-access-fg6tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.146699 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.196123 4610 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.196152 4610 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37d58e6c-0e5d-443a-af0a-595461252d4f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.196166 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6tj\" (UniqueName: \"kubernetes.io/projected/37d58e6c-0e5d-443a-af0a-595461252d4f-kube-api-access-fg6tj\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.196181 4610 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.196192 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.197216 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.216625 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.250110 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-config-data" (OuterVolumeSpecName: "config-data") pod "37d58e6c-0e5d-443a-af0a-595461252d4f" (UID: "37d58e6c-0e5d-443a-af0a-595461252d4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.298185 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.298213 4610 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.298222 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37d58e6c-0e5d-443a-af0a-595461252d4f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.664180 4610 generic.go:334] "Generic (PLEG): container finished" podID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerID="74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab" exitCode=0 Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.664234 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.664233 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerDied","Data":"74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab"} Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.664392 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37d58e6c-0e5d-443a-af0a-595461252d4f","Type":"ContainerDied","Data":"5eb759b364d8463b1001a113bad1ea2d0fa704e9268b9cbc3580a0d8a5e658cb"} Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.664430 4610 scope.go:117] "RemoveContainer" containerID="9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.748642 4610 scope.go:117] "RemoveContainer" containerID="7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.758667 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.768808 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.770102 4610 scope.go:117] "RemoveContainer" containerID="74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.787847 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.788208 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="proxy-httpd" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788224 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="proxy-httpd" Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.788251 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-central-agent" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788259 4610 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-central-agent" Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.788284 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="sg-core" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788290 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="sg-core" Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.788297 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-notification-agent" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788304 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-notification-agent" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788461 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-notification-agent" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788482 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="sg-core" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788495 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="ceilometer-central-agent" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.788510 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" containerName="proxy-httpd" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.790527 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.792635 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.795170 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.800075 4610 scope.go:117] "RemoveContainer" containerID="275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.802137 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.826796 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.836210 4610 scope.go:117] "RemoveContainer" containerID="9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4" Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.838424 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4\": container with ID starting with 9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4 not found: ID does not exist" containerID="9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.838476 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4"} err="failed to get container status \"9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4\": rpc error: code = NotFound desc = could not find container \"9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4\": container with ID starting with 9cb9b8ddb1422a75cd9a207db5a0a65191829e3d9998ce07dba2d7e82bac44e4 not found: ID does not exist" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.838510 4610 scope.go:117] "RemoveContainer" containerID="7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7" Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.838789 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7\": container with ID starting with 7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7 not found: ID does not exist" containerID="7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.838818 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7"} err="failed to get container status \"7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7\": rpc error: code = NotFound desc = could not find container \"7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7\": container with ID starting with 7e1a5e44e060246a63a8bece89c1958839f75802bc411cb44693ce679ea70dc7 not found: ID does not exist" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.838838 4610 scope.go:117] "RemoveContainer" containerID="74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab" Oct 06 09:03:43 
crc kubenswrapper[4610]: E1006 09:03:43.839196 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab\": container with ID starting with 74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab not found: ID does not exist" containerID="74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.839222 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab"} err="failed to get container status \"74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab\": rpc error: code = NotFound desc = could not find container \"74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab\": container with ID starting with 74af7c481ebf44adc63894301b29493ad0d6f9b566056e4cc3e297afaff8b8ab not found: ID does not exist" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.839242 4610 scope.go:117] "RemoveContainer" containerID="275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40" Oct 06 09:03:43 crc kubenswrapper[4610]: E1006 09:03:43.839491 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40\": container with ID starting with 275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40 not found: ID does not exist" containerID="275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.839511 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40"} err="failed to get container status \"275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40\": rpc error: code = NotFound desc = could not find container \"275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40\": container with ID starting with 275bc9538c183ac0009b1432be1128a6c4a2a1a92e9af934d9310a60e22e7a40 not found: ID does not exist" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909209 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6364c8-c83a-400c-85fb-52df075a07d4-run-httpd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909261 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572kd\" (UniqueName: \"kubernetes.io/projected/2b6364c8-c83a-400c-85fb-52df075a07d4-kube-api-access-572kd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909302 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909325 4610 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6364c8-c83a-400c-85fb-52df075a07d4-log-httpd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909475 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909538 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-scripts\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909689 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:43 crc kubenswrapper[4610]: I1006 09:03:43.909912 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-config-data\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.010908 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.010973 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6364c8-c83a-400c-85fb-52df075a07d4-log-httpd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.011007 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.011029 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-scripts\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.011099 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 
crc kubenswrapper[4610]: I1006 09:03:44.011188 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-config-data\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.011244 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6364c8-c83a-400c-85fb-52df075a07d4-run-httpd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.011275 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572kd\" (UniqueName: \"kubernetes.io/projected/2b6364c8-c83a-400c-85fb-52df075a07d4-kube-api-access-572kd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.012169 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6364c8-c83a-400c-85fb-52df075a07d4-run-httpd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.012267 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6364c8-c83a-400c-85fb-52df075a07d4-log-httpd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.016276 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.016511 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-scripts\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.018502 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-config-data\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.033301 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.033973 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6364c8-c83a-400c-85fb-52df075a07d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.035394 4610 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572kd\" (UniqueName: \"kubernetes.io/projected/2b6364c8-c83a-400c-85fb-52df075a07d4-kube-api-access-572kd\") pod \"ceilometer-0\" (UID: \"2b6364c8-c83a-400c-85fb-52df075a07d4\") " pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.130805 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.589320 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 09:03:44 crc kubenswrapper[4610]: W1006 09:03:44.601918 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6364c8_c83a_400c_85fb_52df075a07d4.slice/crio-3e91e073714b4c5e640a4f633a6b8f07e2f1567d772630b4e091144fea9d0d3b WatchSource:0}: Error finding container 3e91e073714b4c5e640a4f633a6b8f07e2f1567d772630b4e091144fea9d0d3b: Status 404 returned error can't find the container with id 3e91e073714b4c5e640a4f633a6b8f07e2f1567d772630b4e091144fea9d0d3b Oct 06 09:03:44 crc kubenswrapper[4610]: I1006 09:03:44.678871 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6364c8-c83a-400c-85fb-52df075a07d4","Type":"ContainerStarted","Data":"3e91e073714b4c5e640a4f633a6b8f07e2f1567d772630b4e091144fea9d0d3b"} Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.022154 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.086118 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d58e6c-0e5d-443a-af0a-595461252d4f" path="/var/lib/kubelet/pods/37d58e6c-0e5d-443a-af0a-595461252d4f/volumes" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.107082 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.696321 4610 generic.go:334] "Generic (PLEG): container finished" podID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerID="d8d815815563192b338bf7a52350e5c32a830098c8c33ed51dce5da3ec1ce104" exitCode=0 Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.698278 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c54e0d9b-8e85-4be5-a5f2-b709028c085f","Type":"ContainerDied","Data":"d8d815815563192b338bf7a52350e5c32a830098c8c33ed51dce5da3ec1ce104"} Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.755953 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.918001 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8vvmt"] Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.927386 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.929923 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.930106 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 09:03:45 crc kubenswrapper[4610]: I1006 09:03:45.948015 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vvmt"] Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.102944 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.103028 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-config-data\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.103123 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-scripts\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.103154 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnhz6\" (UniqueName: \"kubernetes.io/projected/517bbf7b-880c-4564-b328-92d0bbf01003-kube-api-access-jnhz6\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.112473 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.204260 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-scripts\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.204324 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnhz6\" (UniqueName: \"kubernetes.io/projected/517bbf7b-880c-4564-b328-92d0bbf01003-kube-api-access-jnhz6\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.204412 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.204474 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-config-data\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.211613 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-scripts\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.213639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.217715 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-config-data\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.227992 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnhz6\" (UniqueName: \"kubernetes.io/projected/517bbf7b-880c-4564-b328-92d0bbf01003-kube-api-access-jnhz6\") pod \"nova-cell1-cell-mapping-8vvmt\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.268451 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.306107 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54e0d9b-8e85-4be5-a5f2-b709028c085f-logs\") pod \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.306241 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data\") pod \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.306285 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-combined-ca-bundle\") pod \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.306432 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prqpl\" (UniqueName: \"kubernetes.io/projected/c54e0d9b-8e85-4be5-a5f2-b709028c085f-kube-api-access-prqpl\") pod \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.309247 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54e0d9b-8e85-4be5-a5f2-b709028c085f-logs" (OuterVolumeSpecName: "logs") pod "c54e0d9b-8e85-4be5-a5f2-b709028c085f" (UID: "c54e0d9b-8e85-4be5-a5f2-b709028c085f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.312617 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54e0d9b-8e85-4be5-a5f2-b709028c085f-kube-api-access-prqpl" (OuterVolumeSpecName: "kube-api-access-prqpl") pod "c54e0d9b-8e85-4be5-a5f2-b709028c085f" (UID: "c54e0d9b-8e85-4be5-a5f2-b709028c085f"). InnerVolumeSpecName "kube-api-access-prqpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:46 crc kubenswrapper[4610]: E1006 09:03:46.332449 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data podName:c54e0d9b-8e85-4be5-a5f2-b709028c085f nodeName:}" failed. No retries permitted until 2025-10-06 09:03:46.83240432 +0000 UTC m=+1358.547457708 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data") pod "c54e0d9b-8e85-4be5-a5f2-b709028c085f" (UID: "c54e0d9b-8e85-4be5-a5f2-b709028c085f") : error deleting /var/lib/kubelet/pods/c54e0d9b-8e85-4be5-a5f2-b709028c085f/volume-subpaths: remove /var/lib/kubelet/pods/c54e0d9b-8e85-4be5-a5f2-b709028c085f/volume-subpaths: no such file or directory Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.336157 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c54e0d9b-8e85-4be5-a5f2-b709028c085f" (UID: "c54e0d9b-8e85-4be5-a5f2-b709028c085f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.410094 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54e0d9b-8e85-4be5-a5f2-b709028c085f-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.410383 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.410401 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prqpl\" (UniqueName: \"kubernetes.io/projected/c54e0d9b-8e85-4be5-a5f2-b709028c085f-kube-api-access-prqpl\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.469505 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.469553 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.710441 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6364c8-c83a-400c-85fb-52df075a07d4","Type":"ContainerStarted","Data":"ba8257d9bb598261f07f87641699474ea2ad3732f88718d975b6c07a0a946843"} Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.710488 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6364c8-c83a-400c-85fb-52df075a07d4","Type":"ContainerStarted","Data":"8ab1f85de0dc01ad496714fcd0ed2b04905095f2f0b8ae781480f522c4df3ff4"} Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.714578 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.714728 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c54e0d9b-8e85-4be5-a5f2-b709028c085f","Type":"ContainerDied","Data":"dd0b8c14ebe25ec2f3d19fb08c071c436194e748456bcbbc1f1259d4c4f75731"} Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.714764 4610 scope.go:117] "RemoveContainer" containerID="d8d815815563192b338bf7a52350e5c32a830098c8c33ed51dce5da3ec1ce104" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.758145 4610 scope.go:117] "RemoveContainer" containerID="2708de13c63f34b11f36d37cc0f621ef5cea534d9fe2a7e313116cff41296acf" Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.820638 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vvmt"] Oct 06 09:03:46 crc kubenswrapper[4610]: W1006 09:03:46.829703 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod517bbf7b_880c_4564_b328_92d0bbf01003.slice/crio-cd3343b0d36c7643fb991efe09a22bd49a240604563a23c34bd6d7a18b2b31df WatchSource:0}: Error finding container cd3343b0d36c7643fb991efe09a22bd49a240604563a23c34bd6d7a18b2b31df: Status 404 returned error can't find the container with id cd3343b0d36c7643fb991efe09a22bd49a240604563a23c34bd6d7a18b2b31df Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.926218 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data\") pod \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\" (UID: \"c54e0d9b-8e85-4be5-a5f2-b709028c085f\") " Oct 06 09:03:46 crc kubenswrapper[4610]: I1006 09:03:46.930661 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data" (OuterVolumeSpecName: "config-data") pod "c54e0d9b-8e85-4be5-a5f2-b709028c085f" (UID: "c54e0d9b-8e85-4be5-a5f2-b709028c085f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.028331 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54e0d9b-8e85-4be5-a5f2-b709028c085f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.048864 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.056216 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.081921 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" path="/var/lib/kubelet/pods/c54e0d9b-8e85-4be5-a5f2-b709028c085f/volumes" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.082712 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:47 crc kubenswrapper[4610]: E1006 09:03:47.083085 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-api" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.083150 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-api" Oct 06 09:03:47 crc kubenswrapper[4610]: E1006 09:03:47.083229 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-log" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.083384 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-log" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.083751 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-log" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.083874 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54e0d9b-8e85-4be5-a5f2-b709028c085f" containerName="nova-api-api" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.089676 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.093400 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.093736 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.094210 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.104662 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.231013 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.231313 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac3e64d-127a-4d83-8f5e-cb61c4195288-logs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.231458 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp77h\" (UniqueName: \"kubernetes.io/projected/7ac3e64d-127a-4d83-8f5e-cb61c4195288-kube-api-access-rp77h\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.231568 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.231642 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-config-data\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.231758 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.333717 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp77h\" (UniqueName: \"kubernetes.io/projected/7ac3e64d-127a-4d83-8f5e-cb61c4195288-kube-api-access-rp77h\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.333954 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.334097 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-config-data\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.334251 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.334368 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.334522 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac3e64d-127a-4d83-8f5e-cb61c4195288-logs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.335022 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac3e64d-127a-4d83-8f5e-cb61c4195288-logs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.336879 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.337750 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-config-data\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.340820 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.341554 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.356656 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp77h\" (UniqueName: \"kubernetes.io/projected/7ac3e64d-127a-4d83-8f5e-cb61c4195288-kube-api-access-rp77h\") pod \"nova-api-0\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " pod="openstack/nova-api-0" Oct 
06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.409069 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.724179 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6364c8-c83a-400c-85fb-52df075a07d4","Type":"ContainerStarted","Data":"dd379b5a3a2755ee98687bf421851af03b6176c0860f2de87f832fe22f21b7c6"} Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.728212 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vvmt" event={"ID":"517bbf7b-880c-4564-b328-92d0bbf01003","Type":"ContainerStarted","Data":"9e3198f305697609d3b465adfe366344753ac76f62aa4825f338c570a980de61"} Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.728272 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vvmt" event={"ID":"517bbf7b-880c-4564-b328-92d0bbf01003","Type":"ContainerStarted","Data":"cd3343b0d36c7643fb991efe09a22bd49a240604563a23c34bd6d7a18b2b31df"} Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.744198 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8vvmt" podStartSLOduration=2.744181274 podStartE2EDuration="2.744181274s" podCreationTimestamp="2025-10-06 09:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:47.743124997 +0000 UTC m=+1359.458178395" watchObservedRunningTime="2025-10-06 09:03:47.744181274 +0000 UTC m=+1359.459234662" Oct 06 09:03:47 crc kubenswrapper[4610]: I1006 09:03:47.909955 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:48 crc kubenswrapper[4610]: I1006 09:03:48.747085 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ac3e64d-127a-4d83-8f5e-cb61c4195288","Type":"ContainerStarted","Data":"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2"} Oct 06 09:03:48 crc kubenswrapper[4610]: I1006 09:03:48.747332 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ac3e64d-127a-4d83-8f5e-cb61c4195288","Type":"ContainerStarted","Data":"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176"} Oct 06 09:03:48 crc kubenswrapper[4610]: I1006 09:03:48.747346 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ac3e64d-127a-4d83-8f5e-cb61c4195288","Type":"ContainerStarted","Data":"b0a31650973090fd878a0d1cde9ab85b75954fee1779c10dfcda18b84dd4895c"} Oct 06 09:03:48 crc kubenswrapper[4610]: I1006 09:03:48.754204 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 09:03:48 crc kubenswrapper[4610]: I1006 09:03:48.779894 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7798742669999998 podStartE2EDuration="1.779874267s" podCreationTimestamp="2025-10-06 09:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:48.77284375 +0000 UTC m=+1360.487897158" watchObservedRunningTime="2025-10-06 09:03:48.779874267 +0000 UTC m=+1360.494927655" Oct 06 09:03:48 crc kubenswrapper[4610]: I1006 09:03:48.796761 4610 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8627718149999999 podStartE2EDuration="5.796742741s" podCreationTimestamp="2025-10-06 09:03:43 +0000 UTC" firstStartedPulling="2025-10-06 09:03:44.608062185 +0000 UTC m=+1356.323115583" lastFinishedPulling="2025-10-06 09:03:48.542033121 +0000 UTC m=+1360.257086509" observedRunningTime="2025-10-06 09:03:48.788967385 +0000 UTC m=+1360.504020793" watchObservedRunningTime="2025-10-06 09:03:48.796742741 +0000 UTC m=+1360.511796129" Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.295721 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.380228 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-fpkrp"] Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.380784 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" podUID="73c3f46d-9104-485f-8e6e-23b414a33760" containerName="dnsmasq-dns" containerID="cri-o://a5b2e9c60aa1f13426f74250ac90c5b3ba2d4414eb9d8daf9c9bae9a526d60ca" gracePeriod=10 Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.793938 4610 generic.go:334] "Generic (PLEG): container finished" podID="73c3f46d-9104-485f-8e6e-23b414a33760" containerID="a5b2e9c60aa1f13426f74250ac90c5b3ba2d4414eb9d8daf9c9bae9a526d60ca" exitCode=0 Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.794191 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" event={"ID":"73c3f46d-9104-485f-8e6e-23b414a33760","Type":"ContainerDied","Data":"a5b2e9c60aa1f13426f74250ac90c5b3ba2d4414eb9d8daf9c9bae9a526d60ca"} Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.797678 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6364c8-c83a-400c-85fb-52df075a07d4","Type":"ContainerStarted","Data":"f67bc0f023fb614f9c91a60817567f04fb92db37c1845569b7792b9f5343b54c"} Oct 06 09:03:49 crc kubenswrapper[4610]: I1006 09:03:49.931438 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.096558 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-config\") pod \"73c3f46d-9104-485f-8e6e-23b414a33760\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.096643 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-svc\") pod \"73c3f46d-9104-485f-8e6e-23b414a33760\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.096666 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-sb\") pod \"73c3f46d-9104-485f-8e6e-23b414a33760\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.096728 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-nb\") pod \"73c3f46d-9104-485f-8e6e-23b414a33760\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.096820 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-swift-storage-0\") pod \"73c3f46d-9104-485f-8e6e-23b414a33760\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.096845 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zktmm\" (UniqueName: \"kubernetes.io/projected/73c3f46d-9104-485f-8e6e-23b414a33760-kube-api-access-zktmm\") pod \"73c3f46d-9104-485f-8e6e-23b414a33760\" (UID: \"73c3f46d-9104-485f-8e6e-23b414a33760\") " Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.124845 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c3f46d-9104-485f-8e6e-23b414a33760-kube-api-access-zktmm" (OuterVolumeSpecName: "kube-api-access-zktmm") pod "73c3f46d-9104-485f-8e6e-23b414a33760" (UID: "73c3f46d-9104-485f-8e6e-23b414a33760"). InnerVolumeSpecName "kube-api-access-zktmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.202633 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zktmm\" (UniqueName: \"kubernetes.io/projected/73c3f46d-9104-485f-8e6e-23b414a33760-kube-api-access-zktmm\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.215440 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73c3f46d-9104-485f-8e6e-23b414a33760" (UID: "73c3f46d-9104-485f-8e6e-23b414a33760"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.221524 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-config" (OuterVolumeSpecName: "config") pod "73c3f46d-9104-485f-8e6e-23b414a33760" (UID: "73c3f46d-9104-485f-8e6e-23b414a33760"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.229615 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73c3f46d-9104-485f-8e6e-23b414a33760" (UID: "73c3f46d-9104-485f-8e6e-23b414a33760"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.257286 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73c3f46d-9104-485f-8e6e-23b414a33760" (UID: "73c3f46d-9104-485f-8e6e-23b414a33760"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.296504 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73c3f46d-9104-485f-8e6e-23b414a33760" (UID: "73c3f46d-9104-485f-8e6e-23b414a33760"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.304938 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.304974 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.304988 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.305000 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.305012 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c3f46d-9104-485f-8e6e-23b414a33760-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.806400 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" event={"ID":"73c3f46d-9104-485f-8e6e-23b414a33760","Type":"ContainerDied","Data":"d6dd02b4cd9541c04b57226a9238eac43fab096cb7e0e90f56236d4983c70f1f"} Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.806449 4610 scope.go:117] "RemoveContainer" 
containerID="a5b2e9c60aa1f13426f74250ac90c5b3ba2d4414eb9d8daf9c9bae9a526d60ca" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.806483 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-fpkrp" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.827508 4610 scope.go:117] "RemoveContainer" containerID="8be2e7ecda79f2b21dbce606dfcf749fe3edeb5ab7edfefa61f1a950a575a13e" Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.942478 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-fpkrp"] Oct 06 09:03:50 crc kubenswrapper[4610]: I1006 09:03:50.955192 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-fpkrp"] Oct 06 09:03:51 crc kubenswrapper[4610]: I1006 09:03:51.080411 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c3f46d-9104-485f-8e6e-23b414a33760" path="/var/lib/kubelet/pods/73c3f46d-9104-485f-8e6e-23b414a33760/volumes" Oct 06 09:03:52 crc kubenswrapper[4610]: I1006 09:03:52.834948 4610 generic.go:334] "Generic (PLEG): container finished" podID="517bbf7b-880c-4564-b328-92d0bbf01003" containerID="9e3198f305697609d3b465adfe366344753ac76f62aa4825f338c570a980de61" exitCode=0 Oct 06 09:03:52 crc kubenswrapper[4610]: I1006 09:03:52.835090 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vvmt" event={"ID":"517bbf7b-880c-4564-b328-92d0bbf01003","Type":"ContainerDied","Data":"9e3198f305697609d3b465adfe366344753ac76f62aa4825f338c570a980de61"} Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.162733 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.276150 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-combined-ca-bundle\") pod \"517bbf7b-880c-4564-b328-92d0bbf01003\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.276410 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-scripts\") pod \"517bbf7b-880c-4564-b328-92d0bbf01003\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.276450 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnhz6\" (UniqueName: \"kubernetes.io/projected/517bbf7b-880c-4564-b328-92d0bbf01003-kube-api-access-jnhz6\") pod \"517bbf7b-880c-4564-b328-92d0bbf01003\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.276500 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-config-data\") pod \"517bbf7b-880c-4564-b328-92d0bbf01003\" (UID: \"517bbf7b-880c-4564-b328-92d0bbf01003\") " Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.285165 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-scripts" (OuterVolumeSpecName: "scripts") pod "517bbf7b-880c-4564-b328-92d0bbf01003" (UID: "517bbf7b-880c-4564-b328-92d0bbf01003"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.285156 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517bbf7b-880c-4564-b328-92d0bbf01003-kube-api-access-jnhz6" (OuterVolumeSpecName: "kube-api-access-jnhz6") pod "517bbf7b-880c-4564-b328-92d0bbf01003" (UID: "517bbf7b-880c-4564-b328-92d0bbf01003"). InnerVolumeSpecName "kube-api-access-jnhz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.326272 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "517bbf7b-880c-4564-b328-92d0bbf01003" (UID: "517bbf7b-880c-4564-b328-92d0bbf01003"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.330895 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-config-data" (OuterVolumeSpecName: "config-data") pod "517bbf7b-880c-4564-b328-92d0bbf01003" (UID: "517bbf7b-880c-4564-b328-92d0bbf01003"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.379238 4610 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.379290 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnhz6\" (UniqueName: \"kubernetes.io/projected/517bbf7b-880c-4564-b328-92d0bbf01003-kube-api-access-jnhz6\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.379301 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.379310 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517bbf7b-880c-4564-b328-92d0bbf01003-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.863340 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8vvmt" event={"ID":"517bbf7b-880c-4564-b328-92d0bbf01003","Type":"ContainerDied","Data":"cd3343b0d36c7643fb991efe09a22bd49a240604563a23c34bd6d7a18b2b31df"} Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.863394 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd3343b0d36c7643fb991efe09a22bd49a240604563a23c34bd6d7a18b2b31df" Oct 06 09:03:54 crc kubenswrapper[4610]: I1006 09:03:54.863451 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8vvmt" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.103909 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.104816 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-log" containerID="cri-o://20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176" gracePeriod=30 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.105468 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-api" containerID="cri-o://6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2" gracePeriod=30 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.172287 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.172485 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="794a334b-382f-4fbc-b61f-11e5f9afb1c4" containerName="nova-scheduler-scheduler" containerID="cri-o://4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1" gracePeriod=30 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.186262 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.186845 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-log" containerID="cri-o://65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063" gracePeriod=30 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.187299 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-metadata" containerID="cri-o://dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1" gracePeriod=30 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.756873 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.824462 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-config-data\") pod \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.824657 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-combined-ca-bundle\") pod \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.824708 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-internal-tls-certs\") pod \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.824737 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac3e64d-127a-4d83-8f5e-cb61c4195288-logs\") pod \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.824772 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp77h\" (UniqueName: \"kubernetes.io/projected/7ac3e64d-127a-4d83-8f5e-cb61c4195288-kube-api-access-rp77h\") pod \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.824841 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-public-tls-certs\") pod \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\" (UID: \"7ac3e64d-127a-4d83-8f5e-cb61c4195288\") " Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.825373 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac3e64d-127a-4d83-8f5e-cb61c4195288-logs" (OuterVolumeSpecName: "logs") pod "7ac3e64d-127a-4d83-8f5e-cb61c4195288" (UID: "7ac3e64d-127a-4d83-8f5e-cb61c4195288"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.830670 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac3e64d-127a-4d83-8f5e-cb61c4195288-kube-api-access-rp77h" (OuterVolumeSpecName: "kube-api-access-rp77h") pod "7ac3e64d-127a-4d83-8f5e-cb61c4195288" (UID: "7ac3e64d-127a-4d83-8f5e-cb61c4195288"). InnerVolumeSpecName "kube-api-access-rp77h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.855015 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ac3e64d-127a-4d83-8f5e-cb61c4195288" (UID: "7ac3e64d-127a-4d83-8f5e-cb61c4195288"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.861003 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-config-data" (OuterVolumeSpecName: "config-data") pod "7ac3e64d-127a-4d83-8f5e-cb61c4195288" (UID: "7ac3e64d-127a-4d83-8f5e-cb61c4195288"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.885443 4610 generic.go:334] "Generic (PLEG): container finished" podID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerID="65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063" exitCode=143 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.885952 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"637f54e5-73e3-4944-bfcc-076a768c34bc","Type":"ContainerDied","Data":"65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063"} Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888737 4610 generic.go:334] "Generic (PLEG): container finished" podID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerID="6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2" exitCode=0 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888757 4610 generic.go:334] "Generic (PLEG): container finished" podID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerID="20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176" exitCode=143 Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888771 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888776 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ac3e64d-127a-4d83-8f5e-cb61c4195288","Type":"ContainerDied","Data":"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2"} Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888802 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ac3e64d-127a-4d83-8f5e-cb61c4195288","Type":"ContainerDied","Data":"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176"} Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888813 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ac3e64d-127a-4d83-8f5e-cb61c4195288","Type":"ContainerDied","Data":"b0a31650973090fd878a0d1cde9ab85b75954fee1779c10dfcda18b84dd4895c"} Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.888827 4610 scope.go:117] "RemoveContainer" containerID="6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.890166 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7ac3e64d-127a-4d83-8f5e-cb61c4195288" (UID: "7ac3e64d-127a-4d83-8f5e-cb61c4195288"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.908408 4610 scope.go:117] "RemoveContainer" containerID="20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.913374 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ac3e64d-127a-4d83-8f5e-cb61c4195288" (UID: "7ac3e64d-127a-4d83-8f5e-cb61c4195288"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927453 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927481 4610 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927491 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac3e64d-127a-4d83-8f5e-cb61c4195288-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927500 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp77h\" (UniqueName: \"kubernetes.io/projected/7ac3e64d-127a-4d83-8f5e-cb61c4195288-kube-api-access-rp77h\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927535 4610 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927543 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac3e64d-127a-4d83-8f5e-cb61c4195288-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.927774 4610 scope.go:117] "RemoveContainer" containerID="6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2" Oct 06 09:03:55 crc kubenswrapper[4610]: E1006 09:03:55.928507 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2\": container with ID starting with 6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2 not found: ID does not exist" containerID="6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.928546 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2"} err="failed to get container status \"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2\": rpc error: code = NotFound desc = could not find container \"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2\": container with ID starting with 6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2 not found: ID does not exist" Oct 06 09:03:55 crc 
kubenswrapper[4610]: I1006 09:03:55.928573 4610 scope.go:117] "RemoveContainer" containerID="20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176" Oct 06 09:03:55 crc kubenswrapper[4610]: E1006 09:03:55.928791 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176\": container with ID starting with 20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176 not found: ID does not exist" containerID="20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.928817 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176"} err="failed to get container status \"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176\": rpc error: code = NotFound desc = could not find container \"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176\": container with ID starting with 20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176 not found: ID does not exist" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.928835 4610 scope.go:117] "RemoveContainer" containerID="6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.929035 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2"} err="failed to get container status \"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2\": rpc error: code = NotFound desc = could not find container \"6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2\": container with ID starting with 6762dc407c31aec052f7d524a8d68fb205ea11541d51227b4e94b4caa5300cc2 not found: ID does not exist" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.929078 4610 scope.go:117] "RemoveContainer" containerID="20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176" Oct 06 09:03:55 crc kubenswrapper[4610]: I1006 09:03:55.929307 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176"} err="failed to get container status \"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176\": rpc error: code = NotFound desc = could not find container \"20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176\": container with ID starting with 20798daa3f1ff965bd3f29aaf28db51f34806e6820962be522aff9cc50822176 not found: ID does not exist" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.222028 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.231032 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.258256 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.258979 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c3f46d-9104-485f-8e6e-23b414a33760" containerName="dnsmasq-dns" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.258994 4610 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73c3f46d-9104-485f-8e6e-23b414a33760" containerName="dnsmasq-dns" Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.259017 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c3f46d-9104-485f-8e6e-23b414a33760" containerName="init" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259036 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c3f46d-9104-485f-8e6e-23b414a33760" containerName="init" Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.259072 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-log" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259082 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-log" Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.259099 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517bbf7b-880c-4564-b328-92d0bbf01003" containerName="nova-manage" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259109 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="517bbf7b-880c-4564-b328-92d0bbf01003" containerName="nova-manage" Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.259121 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-api" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259129 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-api" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259354 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c3f46d-9104-485f-8e6e-23b414a33760" containerName="dnsmasq-dns" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259384 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-api" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259398 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" containerName="nova-api-log" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.259411 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="517bbf7b-880c-4564-b328-92d0bbf01003" containerName="nova-manage" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.260682 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.274672 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.274924 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.275103 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.301455 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.337628 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c7e6ea-9091-4658-bc36-3c82f6f25682-logs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.337749 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-public-tls-certs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.337873 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l552r\" (UniqueName: \"kubernetes.io/projected/49c7e6ea-9091-4658-bc36-3c82f6f25682-kube-api-access-l552r\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.337903 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.337971 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.338285 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-config-data\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.439507 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c7e6ea-9091-4658-bc36-3c82f6f25682-logs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.439613 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.439683 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l552r\" (UniqueName: \"kubernetes.io/projected/49c7e6ea-9091-4658-bc36-3c82f6f25682-kube-api-access-l552r\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.439712 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.439750 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.439787 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-config-data\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.441384 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c7e6ea-9091-4658-bc36-3c82f6f25682-logs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.444911 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.450864 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-public-tls-certs\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.451216 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-config-data\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.451838 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c7e6ea-9091-4658-bc36-3c82f6f25682-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.457150 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l552r\" (UniqueName: \"kubernetes.io/projected/49c7e6ea-9091-4658-bc36-3c82f6f25682-kube-api-access-l552r\") pod \"nova-api-0\" (UID: \"49c7e6ea-9091-4658-bc36-3c82f6f25682\") " 
pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.541199 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.611023 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.643271 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-combined-ca-bundle\") pod \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.643717 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-config-data\") pod \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.644105 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vf6q\" (UniqueName: \"kubernetes.io/projected/794a334b-382f-4fbc-b61f-11e5f9afb1c4-kube-api-access-6vf6q\") pod \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\" (UID: \"794a334b-382f-4fbc-b61f-11e5f9afb1c4\") " Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.649146 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794a334b-382f-4fbc-b61f-11e5f9afb1c4-kube-api-access-6vf6q" (OuterVolumeSpecName: "kube-api-access-6vf6q") pod "794a334b-382f-4fbc-b61f-11e5f9afb1c4" (UID: "794a334b-382f-4fbc-b61f-11e5f9afb1c4"). InnerVolumeSpecName "kube-api-access-6vf6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.675985 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-config-data" (OuterVolumeSpecName: "config-data") pod "794a334b-382f-4fbc-b61f-11e5f9afb1c4" (UID: "794a334b-382f-4fbc-b61f-11e5f9afb1c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.675959 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794a334b-382f-4fbc-b61f-11e5f9afb1c4" (UID: "794a334b-382f-4fbc-b61f-11e5f9afb1c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.746219 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vf6q\" (UniqueName: \"kubernetes.io/projected/794a334b-382f-4fbc-b61f-11e5f9afb1c4-kube-api-access-6vf6q\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.746258 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.746269 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794a334b-382f-4fbc-b61f-11e5f9afb1c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.903863 4610 generic.go:334] "Generic (PLEG): container finished" podID="794a334b-382f-4fbc-b61f-11e5f9afb1c4" containerID="4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1" exitCode=0 Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.903908 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794a334b-382f-4fbc-b61f-11e5f9afb1c4","Type":"ContainerDied","Data":"4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1"} Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.903939 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794a334b-382f-4fbc-b61f-11e5f9afb1c4","Type":"ContainerDied","Data":"8925b2488dd015306323299cde0350295fb34facd65dcd0154e1cdad9afb0f93"} Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.903958 4610 scope.go:117] "RemoveContainer" containerID="4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.904019 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.937537 4610 scope.go:117] "RemoveContainer" containerID="4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1" Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.938225 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1\": container with ID starting with 4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1 not found: ID does not exist" containerID="4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.938263 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1"} err="failed to get container status \"4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1\": rpc error: code = NotFound desc = could not find container \"4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1\": container with ID starting with 4da620083bdd2cbd2b908e2105501c639a52175db5a53458e75db78c5b1185e1 not found: ID does not exist" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.939519 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.956151 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.968579 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:56 crc kubenswrapper[4610]: E1006 09:03:56.968992 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794a334b-382f-4fbc-b61f-11e5f9afb1c4" containerName="nova-scheduler-scheduler" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.969012 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="794a334b-382f-4fbc-b61f-11e5f9afb1c4" containerName="nova-scheduler-scheduler" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.969207 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="794a334b-382f-4fbc-b61f-11e5f9afb1c4" containerName="nova-scheduler-scheduler" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.969808 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.971467 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 09:03:56 crc kubenswrapper[4610]: I1006 09:03:56.979601 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.050940 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272d54f3-da6d-4d44-b723-956ac2cc65a4-config-data\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.051094 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zlx\" (UniqueName: \"kubernetes.io/projected/272d54f3-da6d-4d44-b723-956ac2cc65a4-kube-api-access-j4zlx\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.051414 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272d54f3-da6d-4d44-b723-956ac2cc65a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: W1006 09:03:57.077436 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c7e6ea_9091_4658_bc36_3c82f6f25682.slice/crio-4bbca2c6735b39ec4b1e536a9c6345364ac2e154c7d457bf370dd9c009b2b3f2 WatchSource:0}: Error finding container 4bbca2c6735b39ec4b1e536a9c6345364ac2e154c7d457bf370dd9c009b2b3f2: Status 404 returned error can't find the container with id 4bbca2c6735b39ec4b1e536a9c6345364ac2e154c7d457bf370dd9c009b2b3f2 Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.090376 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794a334b-382f-4fbc-b61f-11e5f9afb1c4" path="/var/lib/kubelet/pods/794a334b-382f-4fbc-b61f-11e5f9afb1c4/volumes" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.091531 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac3e64d-127a-4d83-8f5e-cb61c4195288" path="/var/lib/kubelet/pods/7ac3e64d-127a-4d83-8f5e-cb61c4195288/volumes" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.093356 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.152606 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zlx\" (UniqueName: \"kubernetes.io/projected/272d54f3-da6d-4d44-b723-956ac2cc65a4-kube-api-access-j4zlx\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.152748 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272d54f3-da6d-4d44-b723-956ac2cc65a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.152792 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272d54f3-da6d-4d44-b723-956ac2cc65a4-config-data\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.160201 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272d54f3-da6d-4d44-b723-956ac2cc65a4-config-data\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.162273 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272d54f3-da6d-4d44-b723-956ac2cc65a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.171441 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zlx\" (UniqueName: \"kubernetes.io/projected/272d54f3-da6d-4d44-b723-956ac2cc65a4-kube-api-access-j4zlx\") pod \"nova-scheduler-0\" (UID: \"272d54f3-da6d-4d44-b723-956ac2cc65a4\") " pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.286643 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.765768 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 09:03:57 crc kubenswrapper[4610]: W1006 09:03:57.771412 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272d54f3_da6d_4d44_b723_956ac2cc65a4.slice/crio-49326d68424ac4dfaa529921a6b4b414117d79d1c8ee4f7c875109d3e8478182 WatchSource:0}: Error finding container 49326d68424ac4dfaa529921a6b4b414117d79d1c8ee4f7c875109d3e8478182: Status 404 returned error can't find the container with id 49326d68424ac4dfaa529921a6b4b414117d79d1c8ee4f7c875109d3e8478182 Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.917440 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c7e6ea-9091-4658-bc36-3c82f6f25682","Type":"ContainerStarted","Data":"a8acc4d137b75d0509f21efce12e43747e39ae36bdfb02608f7cf5067df68846"} Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.917713 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c7e6ea-9091-4658-bc36-3c82f6f25682","Type":"ContainerStarted","Data":"1acd99269dc0be75dd5d45e41328ca342f019ac1cfdac5d1c3bade6bfc3baf30"} Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.917727 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c7e6ea-9091-4658-bc36-3c82f6f25682","Type":"ContainerStarted","Data":"4bbca2c6735b39ec4b1e536a9c6345364ac2e154c7d457bf370dd9c009b2b3f2"} Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.919281 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"272d54f3-da6d-4d44-b723-956ac2cc65a4","Type":"ContainerStarted","Data":"49326d68424ac4dfaa529921a6b4b414117d79d1c8ee4f7c875109d3e8478182"} Oct 06 09:03:57 crc kubenswrapper[4610]: I1006 09:03:57.943742 4610 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=1.943724819 podStartE2EDuration="1.943724819s" podCreationTimestamp="2025-10-06 09:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:57.93700651 +0000 UTC m=+1369.652059928" watchObservedRunningTime="2025-10-06 09:03:57.943724819 +0000 UTC m=+1369.658778207" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.340339 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:35976->10.217.0.196:8775: read: connection reset by peer" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.341375 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:35978->10.217.0.196:8775: read: connection reset by peer" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.843398 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.930842 4610 generic.go:334] "Generic (PLEG): container finished" podID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerID="dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1" exitCode=0 Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.930892 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.930910 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"637f54e5-73e3-4944-bfcc-076a768c34bc","Type":"ContainerDied","Data":"dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1"} Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.930938 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"637f54e5-73e3-4944-bfcc-076a768c34bc","Type":"ContainerDied","Data":"9358d8f80c1caa5088e21d7f907fa441b9a20f9c2b09f8e268338a024ff067b8"} Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.930954 4610 scope.go:117] "RemoveContainer" containerID="dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.936111 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"272d54f3-da6d-4d44-b723-956ac2cc65a4","Type":"ContainerStarted","Data":"bfa337652289723a4fd8f55ac43b7ee13cb6b559585be9db65fec4de8ea3f9b2"} Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.955066 4610 scope.go:117] "RemoveContainer" containerID="65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.971324 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9713020180000003 podStartE2EDuration="2.971302018s" podCreationTimestamp="2025-10-06 09:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:03:58.954030074 +0000 UTC m=+1370.669083482" 
watchObservedRunningTime="2025-10-06 09:03:58.971302018 +0000 UTC m=+1370.686355406" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.985205 4610 scope.go:117] "RemoveContainer" containerID="dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1" Oct 06 09:03:58 crc kubenswrapper[4610]: E1006 09:03:58.985600 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1\": container with ID starting with dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1 not found: ID does not exist" containerID="dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.985629 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1"} err="failed to get container status \"dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1\": rpc error: code = NotFound desc = could not find container \"dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1\": container with ID starting with dd789f8c3a57d0d677ae05994c9e837d74c222c3ed6de0b3bdf797d32d7467c1 not found: ID does not exist" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.985649 4610 scope.go:117] "RemoveContainer" containerID="65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063" Oct 06 09:03:58 crc kubenswrapper[4610]: E1006 09:03:58.986134 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063\": container with ID starting with 65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063 not found: ID does not exist" containerID="65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.986156 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063"} err="failed to get container status \"65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063\": rpc error: code = NotFound desc = could not find container \"65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063\": container with ID starting with 65af6ac20b6052c07783a3e43b52ccfbd6a7d22c141ba074233ad3fac03cc063 not found: ID does not exist" Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.988856 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-combined-ca-bundle\") pod \"637f54e5-73e3-4944-bfcc-076a768c34bc\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.988904 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4j28\" (UniqueName: \"kubernetes.io/projected/637f54e5-73e3-4944-bfcc-076a768c34bc-kube-api-access-l4j28\") pod \"637f54e5-73e3-4944-bfcc-076a768c34bc\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.988986 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/637f54e5-73e3-4944-bfcc-076a768c34bc-logs\") pod 
\"637f54e5-73e3-4944-bfcc-076a768c34bc\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.989071 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-config-data\") pod \"637f54e5-73e3-4944-bfcc-076a768c34bc\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.989178 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-nova-metadata-tls-certs\") pod \"637f54e5-73e3-4944-bfcc-076a768c34bc\" (UID: \"637f54e5-73e3-4944-bfcc-076a768c34bc\") " Oct 06 09:03:58 crc kubenswrapper[4610]: I1006 09:03:58.990148 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637f54e5-73e3-4944-bfcc-076a768c34bc-logs" (OuterVolumeSpecName: "logs") pod "637f54e5-73e3-4944-bfcc-076a768c34bc" (UID: "637f54e5-73e3-4944-bfcc-076a768c34bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.004239 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637f54e5-73e3-4944-bfcc-076a768c34bc-kube-api-access-l4j28" (OuterVolumeSpecName: "kube-api-access-l4j28") pod "637f54e5-73e3-4944-bfcc-076a768c34bc" (UID: "637f54e5-73e3-4944-bfcc-076a768c34bc"). InnerVolumeSpecName "kube-api-access-l4j28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.024339 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-config-data" (OuterVolumeSpecName: "config-data") pod "637f54e5-73e3-4944-bfcc-076a768c34bc" (UID: "637f54e5-73e3-4944-bfcc-076a768c34bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.032785 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "637f54e5-73e3-4944-bfcc-076a768c34bc" (UID: "637f54e5-73e3-4944-bfcc-076a768c34bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.057944 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "637f54e5-73e3-4944-bfcc-076a768c34bc" (UID: "637f54e5-73e3-4944-bfcc-076a768c34bc"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.090959 4610 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/637f54e5-73e3-4944-bfcc-076a768c34bc-logs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.090993 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.091004 4610 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.091012 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637f54e5-73e3-4944-bfcc-076a768c34bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.091023 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4j28\" (UniqueName: \"kubernetes.io/projected/637f54e5-73e3-4944-bfcc-076a768c34bc-kube-api-access-l4j28\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.290844 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.297420 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.314267 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:59 crc kubenswrapper[4610]: E1006 09:03:59.315012 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-metadata" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.315032 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-metadata" Oct 06 09:03:59 crc kubenswrapper[4610]: E1006 09:03:59.315061 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-log" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.315067 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-log" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.315421 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-log" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.315444 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" containerName="nova-metadata-metadata" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.318369 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.320607 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.323390 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.329089 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.395999 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-config-data\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.396124 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rnt\" (UniqueName: \"kubernetes.io/projected/65ccdb5a-a886-4df8-9f4c-9bccb814641a-kube-api-access-m7rnt\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.396159 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ccdb5a-a886-4df8-9f4c-9bccb814641a-logs\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.396371 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.396425 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.498207 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.498293 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-config-data\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.498357 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rnt\" (UniqueName: \"kubernetes.io/projected/65ccdb5a-a886-4df8-9f4c-9bccb814641a-kube-api-access-m7rnt\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " 
pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.498379 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ccdb5a-a886-4df8-9f4c-9bccb814641a-logs\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.498439 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.499019 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ccdb5a-a886-4df8-9f4c-9bccb814641a-logs\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.502705 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.502759 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.507713 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ccdb5a-a886-4df8-9f4c-9bccb814641a-config-data\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.527079 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rnt\" (UniqueName: \"kubernetes.io/projected/65ccdb5a-a886-4df8-9f4c-9bccb814641a-kube-api-access-m7rnt\") pod \"nova-metadata-0\" (UID: \"65ccdb5a-a886-4df8-9f4c-9bccb814641a\") " pod="openstack/nova-metadata-0" Oct 06 09:03:59 crc kubenswrapper[4610]: I1006 09:03:59.642115 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 09:04:00 crc kubenswrapper[4610]: I1006 09:04:00.274876 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 09:04:00 crc kubenswrapper[4610]: I1006 09:04:00.978361 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ccdb5a-a886-4df8-9f4c-9bccb814641a","Type":"ContainerStarted","Data":"aeba5ca98d8af70c838911b1871941aef10db784c9f8e075ba8343e248a7c6d1"} Oct 06 09:04:00 crc kubenswrapper[4610]: I1006 09:04:00.978658 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ccdb5a-a886-4df8-9f4c-9bccb814641a","Type":"ContainerStarted","Data":"28c212960e9b64238d948ca2333a26d395ae566d41874ef502a4e99dfa43985b"} Oct 06 09:04:00 crc kubenswrapper[4610]: I1006 09:04:00.978669 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"65ccdb5a-a886-4df8-9f4c-9bccb814641a","Type":"ContainerStarted","Data":"13bd72ec7d0bdd685b3cfb9577565926a2b60dc8233a23723eee8b006ce64230"} Oct 06 09:04:01 crc kubenswrapper[4610]: I1006 09:04:01.007722 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.007687855 podStartE2EDuration="2.007687855s" podCreationTimestamp="2025-10-06 09:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:04:00.993859977 +0000 UTC m=+1372.708913375" watchObservedRunningTime="2025-10-06 09:04:01.007687855 +0000 UTC m=+1372.722741273" Oct 06 09:04:01 crc kubenswrapper[4610]: I1006 09:04:01.079991 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637f54e5-73e3-4944-bfcc-076a768c34bc" path="/var/lib/kubelet/pods/637f54e5-73e3-4944-bfcc-076a768c34bc/volumes" Oct 06 09:04:02 crc kubenswrapper[4610]: I1006 09:04:02.287789 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 09:04:04 crc kubenswrapper[4610]: I1006 09:04:04.642888 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 09:04:04 crc kubenswrapper[4610]: I1006 09:04:04.643182 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 09:04:06 crc kubenswrapper[4610]: I1006 09:04:06.612638 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 09:04:06 crc kubenswrapper[4610]: I1006 09:04:06.614811 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 09:04:07 crc kubenswrapper[4610]: I1006 09:04:07.287556 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 09:04:07 crc kubenswrapper[4610]: I1006 09:04:07.321477 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 09:04:07 crc kubenswrapper[4610]: I1006 09:04:07.623258 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49c7e6ea-9091-4658-bc36-3c82f6f25682" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 09:04:07 crc kubenswrapper[4610]: I1006 09:04:07.623640 4610 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="49c7e6ea-9091-4658-bc36-3c82f6f25682" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 09:04:08 crc kubenswrapper[4610]: I1006 09:04:08.096638 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 09:04:09 crc kubenswrapper[4610]: I1006 09:04:09.642594 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 09:04:09 crc kubenswrapper[4610]: I1006 09:04:09.643412 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 09:04:10 crc kubenswrapper[4610]: I1006 09:04:10.660249 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="65ccdb5a-a886-4df8-9f4c-9bccb814641a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 09:04:10 crc kubenswrapper[4610]: I1006 09:04:10.660258 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="65ccdb5a-a886-4df8-9f4c-9bccb814641a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 09:04:14 crc kubenswrapper[4610]: I1006 09:04:14.148857 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.468678 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.468935 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.468978 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.469626 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86993ec28f7e0d41d125c67a4926a12ed67d073648b7d992a6c8ef6e8c000659"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.469674 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://86993ec28f7e0d41d125c67a4926a12ed67d073648b7d992a6c8ef6e8c000659" gracePeriod=600 Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.625214 4610 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.625609 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.636774 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 09:04:16 crc kubenswrapper[4610]: I1006 09:04:16.637835 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 09:04:17 crc kubenswrapper[4610]: I1006 09:04:17.160670 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"86993ec28f7e0d41d125c67a4926a12ed67d073648b7d992a6c8ef6e8c000659"} Oct 06 09:04:17 crc kubenswrapper[4610]: I1006 09:04:17.160937 4610 scope.go:117] "RemoveContainer" containerID="2a03a6c0215984950d574d138749aa7d53fb617a66262307cd832997f9be78d9" Oct 06 09:04:17 crc kubenswrapper[4610]: I1006 09:04:17.160636 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="86993ec28f7e0d41d125c67a4926a12ed67d073648b7d992a6c8ef6e8c000659" exitCode=0 Oct 06 09:04:17 crc kubenswrapper[4610]: I1006 09:04:17.160986 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"} Oct 06 09:04:17 crc kubenswrapper[4610]: I1006 09:04:17.161497 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 09:04:17 crc kubenswrapper[4610]: I1006 09:04:17.177805 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 09:04:18 crc kubenswrapper[4610]: I1006 09:04:18.961100 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zzfqg"] Oct 06 09:04:18 crc kubenswrapper[4610]: I1006 09:04:18.964705 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:18 crc kubenswrapper[4610]: I1006 09:04:18.977136 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzfqg"] Oct 06 09:04:18 crc kubenswrapper[4610]: I1006 09:04:18.978215 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmjq7\" (UniqueName: \"kubernetes.io/projected/76d8bd36-f193-4dec-9e26-d41537c5c6dc-kube-api-access-tmjq7\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:18 crc kubenswrapper[4610]: I1006 09:04:18.978466 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-utilities\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:18 crc kubenswrapper[4610]: I1006 09:04:18.978502 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-catalog-content\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.080467 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-utilities\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.080520 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-catalog-content\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.080592 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmjq7\" (UniqueName: \"kubernetes.io/projected/76d8bd36-f193-4dec-9e26-d41537c5c6dc-kube-api-access-tmjq7\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.081216 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-catalog-content\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.082202 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-utilities\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.101176 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tmjq7\" (UniqueName: \"kubernetes.io/projected/76d8bd36-f193-4dec-9e26-d41537c5c6dc-kube-api-access-tmjq7\") pod \"redhat-operators-zzfqg\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.295059 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.647376 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.648185 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.654001 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 09:04:19 crc kubenswrapper[4610]: I1006 09:04:19.755846 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzfqg"] Oct 06 09:04:20 crc kubenswrapper[4610]: I1006 09:04:20.202747 4610 generic.go:334] "Generic (PLEG): container finished" podID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerID="787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530" exitCode=0 Oct 06 09:04:20 crc kubenswrapper[4610]: I1006 09:04:20.203219 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerDied","Data":"787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530"} Oct 06 09:04:20 crc kubenswrapper[4610]: I1006 09:04:20.203285 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerStarted","Data":"012fbdb9dd7b28d6149175dcc1901b55d488222ef04b23041ddb6070caa5d0b6"} Oct 06 09:04:20 crc kubenswrapper[4610]: I1006 09:04:20.221960 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 09:04:22 crc kubenswrapper[4610]: I1006 09:04:22.227248 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerStarted","Data":"f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256"} Oct 06 09:04:25 crc kubenswrapper[4610]: I1006 09:04:25.254545 4610 generic.go:334] "Generic (PLEG): container finished" podID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerID="f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256" exitCode=0 Oct 06 09:04:25 crc kubenswrapper[4610]: I1006 09:04:25.254632 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerDied","Data":"f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256"} Oct 06 09:04:26 crc kubenswrapper[4610]: I1006 09:04:26.269493 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerStarted","Data":"2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18"} Oct 06 09:04:26 crc kubenswrapper[4610]: I1006 09:04:26.298754 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-zzfqg" podStartSLOduration=2.800530525 podStartE2EDuration="8.298734014s" podCreationTimestamp="2025-10-06 09:04:18 +0000 UTC" firstStartedPulling="2025-10-06 09:04:20.205474323 +0000 UTC m=+1391.920527731" lastFinishedPulling="2025-10-06 09:04:25.703677802 +0000 UTC m=+1397.418731220" observedRunningTime="2025-10-06 09:04:26.290387694 +0000 UTC m=+1398.005441092" watchObservedRunningTime="2025-10-06 09:04:26.298734014 +0000 UTC m=+1398.013787412" Oct 06 09:04:28 crc kubenswrapper[4610]: I1006 09:04:28.201957 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 09:04:29 crc kubenswrapper[4610]: I1006 09:04:29.101924 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 09:04:29 crc kubenswrapper[4610]: I1006 09:04:29.297211 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:29 crc kubenswrapper[4610]: I1006 09:04:29.297255 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:30 crc kubenswrapper[4610]: I1006 09:04:30.425193 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zzfqg" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" probeResult="failure" output=< Oct 06 09:04:30 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:04:30 crc kubenswrapper[4610]: > Oct 06 09:04:33 crc kubenswrapper[4610]: I1006 09:04:33.168578 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="rabbitmq" containerID="cri-o://f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4" gracePeriod=604796 Oct 06 09:04:33 crc kubenswrapper[4610]: I1006 09:04:33.337470 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="rabbitmq" containerID="cri-o://c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7" gracePeriod=604796 Oct 06 09:04:34 crc kubenswrapper[4610]: I1006 09:04:34.686756 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Oct 06 09:04:35 crc kubenswrapper[4610]: I1006 09:04:35.105258 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 06 09:04:39 crc kubenswrapper[4610]: I1006 09:04:39.976904 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 09:04:39 crc kubenswrapper[4610]: I1006 09:04:39.984184 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.112860 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-config-data\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113241 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb4s7\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-kube-api-access-gb4s7\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113316 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-server-conf\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113340 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-tls\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113361 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-confd\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113421 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-erlang-cookie\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113459 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-server-conf\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113494 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-config-data\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113537 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blwpn\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-kube-api-access-blwpn\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113587 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-erlang-cookie-secret\") pod 
\"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113613 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-erlang-cookie\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113654 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-tls\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113722 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-confd\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113751 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-pod-info\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113784 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-plugins\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113815 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-erlang-cookie-secret\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113840 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113871 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-plugins\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113902 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-pod-info\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113927 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-plugins-conf\") pod 
\"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\" (UID: \"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.113979 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.114003 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-plugins-conf\") pod \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\" (UID: \"764e6cbc-bf6c-4120-9e38-cf70e046dcf8\") " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.116178 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.124657 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.129107 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.129634 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.133224 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-kube-api-access-gb4s7" (OuterVolumeSpecName: "kube-api-access-gb4s7") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "kube-api-access-gb4s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.140292 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.141106 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.142067 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.142971 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.166831 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.166840 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-kube-api-access-blwpn" (OuterVolumeSpecName: "kube-api-access-blwpn") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "kube-api-access-blwpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.166932 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-pod-info" (OuterVolumeSpecName: "pod-info") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.167674 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.172750 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-pod-info" (OuterVolumeSpecName: "pod-info") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.172868 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.180372 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223325 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb4s7\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-kube-api-access-gb4s7\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223348 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223358 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223367 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blwpn\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-kube-api-access-blwpn\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223376 4610 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223384 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223394 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223403 4610 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223411 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223419 4610 reconciler_common.go:293] "Volume detached for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223446 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223456 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223465 4610 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223473 4610 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223496 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.223505 4610 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.251706 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-config-data" (OuterVolumeSpecName: "config-data") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.260136 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-server-conf" (OuterVolumeSpecName: "server-conf") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.291562 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.312865 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-config-data" (OuterVolumeSpecName: "config-data") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.325287 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.325319 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.325328 4610 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.325336 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.333518 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.347772 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zzfqg" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" probeResult="failure" output=< Oct 06 09:04:40 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:04:40 crc kubenswrapper[4610]: > Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.362299 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-server-conf" (OuterVolumeSpecName: "server-conf") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.407231 4610 generic.go:334] "Generic (PLEG): container finished" podID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerID="f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4" exitCode=0 Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.407302 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3","Type":"ContainerDied","Data":"f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4"} Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.407330 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2adc9dee-eebc-4fec-9af7-ecdcbf1136f3","Type":"ContainerDied","Data":"a45e3ab8c4affcccd5400ecdc14efafd3cc9b133afd200e04274d5d987830163"} Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.407346 4610 scope.go:117] "RemoveContainer" containerID="f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.407474 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.415871 4610 generic.go:334] "Generic (PLEG): container finished" podID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerID="c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7" exitCode=0 Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.415913 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"764e6cbc-bf6c-4120-9e38-cf70e046dcf8","Type":"ContainerDied","Data":"c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7"} Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.415953 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"764e6cbc-bf6c-4120-9e38-cf70e046dcf8","Type":"ContainerDied","Data":"c30d30d4885cdcfa10f02d477d64007b9dd5ad09557d6801298e0618d8f91d9c"} Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.416019 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.431868 4610 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.431913 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.442254 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" (UID: "2adc9dee-eebc-4fec-9af7-ecdcbf1136f3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.454710 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "764e6cbc-bf6c-4120-9e38-cf70e046dcf8" (UID: "764e6cbc-bf6c-4120-9e38-cf70e046dcf8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.514511 4610 scope.go:117] "RemoveContainer" containerID="435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.535365 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/764e6cbc-bf6c-4120-9e38-cf70e046dcf8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.535401 4610 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.539735 4610 scope.go:117] "RemoveContainer" containerID="f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.540214 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4\": container with ID starting with f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4 not found: ID does not exist" containerID="f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.540245 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4"} err="failed to get container status \"f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4\": rpc error: code = NotFound desc = could not find container \"f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4\": container with ID starting with f50edcf3a4519c76cfb3af9bb5f07d5b5e77705cde7a2870f51be4976a8f5ee4 not found: ID does not exist" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.540266 4610 scope.go:117] "RemoveContainer" containerID="435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.540566 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f\": container with ID starting with 435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f not found: ID does not exist" containerID="435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.540586 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f"} err="failed to get container status \"435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f\": rpc error: code = NotFound desc = could not find container \"435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f\": container with ID starting with 435091d3d52444d457269b77db0acfb37f154032ad37680f1fe3e3a13a8a556f not found: ID does not exist" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.540600 4610 scope.go:117] "RemoveContainer" containerID="c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.586547 4610 scope.go:117] "RemoveContainer" 
containerID="dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.620835 4610 scope.go:117] "RemoveContainer" containerID="c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.621297 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7\": container with ID starting with c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7 not found: ID does not exist" containerID="c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.621340 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7"} err="failed to get container status \"c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7\": rpc error: code = NotFound desc = could not find container \"c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7\": container with ID starting with c997d6d7b8d0fcb82d6e2bc56d8e3c713bda0efd33fe87065554fe04fdd485e7 not found: ID does not exist" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.621363 4610 scope.go:117] "RemoveContainer" containerID="dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.621748 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec\": container with ID starting with dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec not found: ID does not exist" containerID="dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.621767 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec"} err="failed to get container status \"dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec\": rpc error: code = NotFound desc = could not find container \"dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec\": container with ID starting with dad6cd398537ed841e94c0a5f167c4d34e3f123e03ff40d050b410445b173fec not found: ID does not exist" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.780778 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.789468 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.813731 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.821560 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.833013 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.833378 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="rabbitmq" Oct 06 09:04:40 crc 
kubenswrapper[4610]: I1006 09:04:40.833394 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="rabbitmq" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.833409 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="setup-container" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.833415 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="setup-container" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.833448 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="rabbitmq" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.833455 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="rabbitmq" Oct 06 09:04:40 crc kubenswrapper[4610]: E1006 09:04:40.833463 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="setup-container" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.833469 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="setup-container" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.833645 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" containerName="rabbitmq" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.833657 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" containerName="rabbitmq" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.834621 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.838849 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.839007 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.839129 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.839270 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.839371 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.839450 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bwq8w" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.839780 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.844343 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.847371 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.852590 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.852887 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-th5dt" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.853059 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.853188 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.853438 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.853453 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.853602 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.869015 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.879029 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.944931 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.944975 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945018 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945058 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945205 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: 
I1006 09:04:40.945250 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945318 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945339 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945358 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945425 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-config-data\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945451 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/099c0f32-ad2c-4b69-a308-f46f3dbab2be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945487 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945503 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/060bb971-d347-44c3-b9ce-6c06c13bcb51-pod-info\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945522 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945541 4610 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/099c0f32-ad2c-4b69-a308-f46f3dbab2be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945559 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945623 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945700 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtpf\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-kube-api-access-vmtpf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945760 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945879 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742vm\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-kube-api-access-742vm\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.945912 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/060bb971-d347-44c3-b9ce-6c06c13bcb51-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:40 crc kubenswrapper[4610]: I1006 09:04:40.946005 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-server-conf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048257 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742vm\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-kube-api-access-742vm\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048303 4610 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/060bb971-d347-44c3-b9ce-6c06c13bcb51-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048331 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-server-conf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048388 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048405 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048435 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048457 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048484 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048501 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048528 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048547 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-tls\") pod \"rabbitmq-server-0\" 
(UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048567 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048595 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-config-data\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048615 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/099c0f32-ad2c-4b69-a308-f46f3dbab2be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048643 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048663 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/060bb971-d347-44c3-b9ce-6c06c13bcb51-pod-info\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048681 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048696 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/099c0f32-ad2c-4b69-a308-f46f3dbab2be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048713 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048747 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048786 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vmtpf\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-kube-api-access-vmtpf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.048814 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.049491 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.050308 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.050562 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.050583 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.050936 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.051433 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.051627 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.052023 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099c0f32-ad2c-4b69-a308-f46f3dbab2be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.052263 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-config-data\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.053514 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/060bb971-d347-44c3-b9ce-6c06c13bcb51-server-conf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.053614 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.058391 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/060bb971-d347-44c3-b9ce-6c06c13bcb51-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.061573 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/060bb971-d347-44c3-b9ce-6c06c13bcb51-pod-info\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.063831 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.064160 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.064734 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.066007 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.066407 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/099c0f32-ad2c-4b69-a308-f46f3dbab2be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.069331 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742vm\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-kube-api-access-742vm\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.075719 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/099c0f32-ad2c-4b69-a308-f46f3dbab2be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.082493 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2adc9dee-eebc-4fec-9af7-ecdcbf1136f3" path="/var/lib/kubelet/pods/2adc9dee-eebc-4fec-9af7-ecdcbf1136f3/volumes" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.083329 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="764e6cbc-bf6c-4120-9e38-cf70e046dcf8" path="/var/lib/kubelet/pods/764e6cbc-bf6c-4120-9e38-cf70e046dcf8/volumes" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.083353 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/099c0f32-ad2c-4b69-a308-f46f3dbab2be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.083817 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmtpf\" (UniqueName: \"kubernetes.io/projected/060bb971-d347-44c3-b9ce-6c06c13bcb51-kube-api-access-vmtpf\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.107549 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"099c0f32-ad2c-4b69-a308-f46f3dbab2be\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.111525 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"060bb971-d347-44c3-b9ce-6c06c13bcb51\") " pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.167906 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.174874 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.414632 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2d799"] Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.416616 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.442211 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2d799"] Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.455238 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g62s\" (UniqueName: \"kubernetes.io/projected/4c65756c-04f9-4ee3-90fb-e57fcff2150b-kube-api-access-7g62s\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.455330 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-catalog-content\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.455413 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-utilities\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.555838 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-catalog-content\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.555946 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-utilities\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.555974 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g62s\" (UniqueName: \"kubernetes.io/projected/4c65756c-04f9-4ee3-90fb-e57fcff2150b-kube-api-access-7g62s\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.556675 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-catalog-content\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.557161 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-utilities\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.580406 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7g62s\" (UniqueName: \"kubernetes.io/projected/4c65756c-04f9-4ee3-90fb-e57fcff2150b-kube-api-access-7g62s\") pod \"certified-operators-2d799\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.706966 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.729462 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 09:04:41 crc kubenswrapper[4610]: W1006 09:04:41.741660 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod099c0f32_ad2c_4b69_a308_f46f3dbab2be.slice/crio-7d82c357d3ddc091a50e34fa090241283b052876f7f9a9ae4843b030323f4e86 WatchSource:0}: Error finding container 7d82c357d3ddc091a50e34fa090241283b052876f7f9a9ae4843b030323f4e86: Status 404 returned error can't find the container with id 7d82c357d3ddc091a50e34fa090241283b052876f7f9a9ae4843b030323f4e86 Oct 06 09:04:41 crc kubenswrapper[4610]: I1006 09:04:41.765573 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:42 crc kubenswrapper[4610]: I1006 09:04:42.251605 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2d799"] Oct 06 09:04:42 crc kubenswrapper[4610]: W1006 09:04:42.260527 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c65756c_04f9_4ee3_90fb_e57fcff2150b.slice/crio-1864650a4f4a20736888add8df8d9fdcc8ac648cd0e39a4b435835343d422f19 WatchSource:0}: Error finding container 1864650a4f4a20736888add8df8d9fdcc8ac648cd0e39a4b435835343d422f19: Status 404 returned error can't find the container with id 1864650a4f4a20736888add8df8d9fdcc8ac648cd0e39a4b435835343d422f19 Oct 06 09:04:42 crc kubenswrapper[4610]: I1006 09:04:42.463280 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"060bb971-d347-44c3-b9ce-6c06c13bcb51","Type":"ContainerStarted","Data":"be7e13bdf2c192ec2d60bffaccf7608fa1e7fdb8ce5f5d2599cff371037c562d"} Oct 06 09:04:42 crc kubenswrapper[4610]: I1006 09:04:42.465106 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerStarted","Data":"ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c"} Oct 06 09:04:42 crc kubenswrapper[4610]: I1006 09:04:42.465283 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerStarted","Data":"1864650a4f4a20736888add8df8d9fdcc8ac648cd0e39a4b435835343d422f19"} Oct 06 09:04:42 crc kubenswrapper[4610]: I1006 09:04:42.470401 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"099c0f32-ad2c-4b69-a308-f46f3dbab2be","Type":"ContainerStarted","Data":"7d82c357d3ddc091a50e34fa090241283b052876f7f9a9ae4843b030323f4e86"} Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.480825 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"060bb971-d347-44c3-b9ce-6c06c13bcb51","Type":"ContainerStarted","Data":"b955ec40defc5de4c039348ac943d9c5c34d829221d1d667c0b99c1c5ac2dd7d"} Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.482350 4610 generic.go:334] "Generic (PLEG): container finished" podID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerID="ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c" exitCode=0 Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.482445 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerDied","Data":"ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c"} Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.486114 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"099c0f32-ad2c-4b69-a308-f46f3dbab2be","Type":"ContainerStarted","Data":"07b3d9fabae0e21ef224106dbcb4e4ca4042b18c4c9f4bcfae2034459e2b8726"} Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.627318 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-r8l6z"] Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.629310 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.631191 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.658189 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-r8l6z"] Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709109 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709164 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709183 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709233 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-config\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709411 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwj2\" (UniqueName: 
\"kubernetes.io/projected/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-kube-api-access-sbwj2\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709464 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-svc\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.709519 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811530 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811589 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811628 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811690 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-config\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811769 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwj2\" (UniqueName: \"kubernetes.io/projected/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-kube-api-access-sbwj2\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811816 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-svc\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.811881 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.812781 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-svc\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.812818 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.812844 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.813022 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-config\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.813354 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.813656 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.830859 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbwj2\" (UniqueName: \"kubernetes.io/projected/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-kube-api-access-sbwj2\") pod \"dnsmasq-dns-5576978c7c-r8l6z\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:43 crc kubenswrapper[4610]: I1006 09:04:43.954525 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:44 crc kubenswrapper[4610]: I1006 09:04:44.497315 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerStarted","Data":"3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753"} Oct 06 09:04:44 crc kubenswrapper[4610]: I1006 09:04:44.499025 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" event={"ID":"2c61f8a2-8bb2-4f82-a997-42d6218b0e76","Type":"ContainerStarted","Data":"f2a135156e6fc688fa3e51cce14dca70f63073248aa9e4b753e8d5303c5cd742"} Oct 06 09:04:44 crc kubenswrapper[4610]: I1006 09:04:44.500022 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-r8l6z"] Oct 06 09:04:45 crc kubenswrapper[4610]: I1006 09:04:45.508609 4610 generic.go:334] "Generic (PLEG): container finished" podID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerID="827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d" exitCode=0 Oct 06 09:04:45 crc kubenswrapper[4610]: I1006 09:04:45.508698 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" event={"ID":"2c61f8a2-8bb2-4f82-a997-42d6218b0e76","Type":"ContainerDied","Data":"827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d"} Oct 06 09:04:46 crc kubenswrapper[4610]: I1006 09:04:46.529997 4610 generic.go:334] "Generic (PLEG): container finished" podID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerID="3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753" exitCode=0 Oct 06 09:04:46 crc kubenswrapper[4610]: I1006 09:04:46.530372 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerDied","Data":"3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753"} Oct 06 09:04:46 crc kubenswrapper[4610]: I1006 09:04:46.535801 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" event={"ID":"2c61f8a2-8bb2-4f82-a997-42d6218b0e76","Type":"ContainerStarted","Data":"721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce"} Oct 06 09:04:46 crc kubenswrapper[4610]: I1006 09:04:46.536216 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:46 crc kubenswrapper[4610]: I1006 09:04:46.569477 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" podStartSLOduration=3.569459451 podStartE2EDuration="3.569459451s" podCreationTimestamp="2025-10-06 09:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:04:46.565994119 +0000 UTC m=+1418.281047517" watchObservedRunningTime="2025-10-06 09:04:46.569459451 +0000 UTC m=+1418.284512839" Oct 06 09:04:47 crc kubenswrapper[4610]: I1006 09:04:47.547771 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerStarted","Data":"82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c"} Oct 06 09:04:47 crc kubenswrapper[4610]: I1006 09:04:47.573257 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-2d799" podStartSLOduration=3.038348689 podStartE2EDuration="6.573232266s" podCreationTimestamp="2025-10-06 09:04:41 +0000 UTC" firstStartedPulling="2025-10-06 09:04:43.484502311 +0000 UTC m=+1415.199555719" lastFinishedPulling="2025-10-06 09:04:47.019385918 +0000 UTC m=+1418.734439296" observedRunningTime="2025-10-06 09:04:47.564064462 +0000 UTC m=+1419.279117880" watchObservedRunningTime="2025-10-06 09:04:47.573232266 +0000 UTC m=+1419.288285654" Oct 06 09:04:50 crc kubenswrapper[4610]: I1006 09:04:50.343635 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zzfqg" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" probeResult="failure" output=< Oct 06 09:04:50 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:04:50 crc kubenswrapper[4610]: > Oct 06 09:04:51 crc kubenswrapper[4610]: I1006 09:04:51.766229 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:51 crc kubenswrapper[4610]: I1006 09:04:51.766488 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:51 crc kubenswrapper[4610]: I1006 09:04:51.857217 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:52 crc kubenswrapper[4610]: I1006 09:04:52.659145 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:52 crc kubenswrapper[4610]: I1006 09:04:52.726161 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2d799"] Oct 06 09:04:53 crc kubenswrapper[4610]: I1006 09:04:53.956281 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.025389 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-nm5v7"] Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.025621 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="dnsmasq-dns" containerID="cri-o://5dfa7208bf947a680ecb804f43446466bbb58ce3891c43cb41255957abb288a0" gracePeriod=10 Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.249979 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-667c9c995c-dfhd5"] Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.251820 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.257782 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667c9c995c-dfhd5"] Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.294207 4610 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.203:5353: connect: connection refused" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329515 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-ovsdbserver-nb\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329592 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-config\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329650 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-openstack-edpm-ipam\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329670 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-dns-svc\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329704 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-ovsdbserver-sb\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329722 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-dns-swift-storage-0\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.329761 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5pf\" (UniqueName: \"kubernetes.io/projected/2228f60d-4cb6-43a2-9259-848d7353ad4b-kube-api-access-xs5pf\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.431648 4610 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-openstack-edpm-ipam\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432006 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-dns-svc\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432064 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-ovsdbserver-sb\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432094 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-dns-swift-storage-0\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432154 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5pf\" (UniqueName: \"kubernetes.io/projected/2228f60d-4cb6-43a2-9259-848d7353ad4b-kube-api-access-xs5pf\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432270 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-ovsdbserver-nb\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432319 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-config\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.432912 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-openstack-edpm-ipam\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.433495 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-config\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.433678 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-dns-swift-storage-0\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.434016 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-ovsdbserver-nb\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.434433 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-dns-svc\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.434500 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2228f60d-4cb6-43a2-9259-848d7353ad4b-ovsdbserver-sb\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.491349 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5pf\" (UniqueName: \"kubernetes.io/projected/2228f60d-4cb6-43a2-9259-848d7353ad4b-kube-api-access-xs5pf\") pod \"dnsmasq-dns-667c9c995c-dfhd5\" (UID: \"2228f60d-4cb6-43a2-9259-848d7353ad4b\") " pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.573717 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.617374 4610 generic.go:334] "Generic (PLEG): container finished" podID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerID="5dfa7208bf947a680ecb804f43446466bbb58ce3891c43cb41255957abb288a0" exitCode=0 Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.617589 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2d799" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="registry-server" containerID="cri-o://82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c" gracePeriod=2 Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.617839 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" event={"ID":"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94","Type":"ContainerDied","Data":"5dfa7208bf947a680ecb804f43446466bbb58ce3891c43cb41255957abb288a0"} Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.617870 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" event={"ID":"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94","Type":"ContainerDied","Data":"b0b761c7b51302102a18ced5c7900074217e38783e89ad0010ba5a726f1efcca"} Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.617880 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b761c7b51302102a18ced5c7900074217e38783e89ad0010ba5a726f1efcca" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.797874 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.946933 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-nb\") pod \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.947000 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-config\") pod \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.947033 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-svc\") pod \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.947130 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf6fw\" (UniqueName: \"kubernetes.io/projected/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-kube-api-access-rf6fw\") pod \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.947206 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-swift-storage-0\") pod \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.948098 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-sb\") pod \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\" (UID: \"e898ec76-5bfb-4dd2-a007-d96f6a2e5d94\") " Oct 06 09:04:54 crc kubenswrapper[4610]: I1006 09:04:54.953167 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-kube-api-access-rf6fw" (OuterVolumeSpecName: "kube-api-access-rf6fw") pod "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" (UID: "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94"). InnerVolumeSpecName "kube-api-access-rf6fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.011570 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" (UID: "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.013708 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" (UID: "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.015750 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" (UID: "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.029302 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-config" (OuterVolumeSpecName: "config") pod "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" (UID: "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.037621 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" (UID: "e898ec76-5bfb-4dd2-a007-d96f6a2e5d94"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.050315 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.050345 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.050356 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.050366 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.050376 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.050384 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf6fw\" (UniqueName: \"kubernetes.io/projected/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94-kube-api-access-rf6fw\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: W1006 09:04:55.088538 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2228f60d_4cb6_43a2_9259_848d7353ad4b.slice/crio-44ef443ea6aa2ef73f9ab9b3ddd3901add11ab6fb0e5b6bdba222660e20c5922 WatchSource:0}: Error finding container 44ef443ea6aa2ef73f9ab9b3ddd3901add11ab6fb0e5b6bdba222660e20c5922: Status 404 returned error can't find the container with id 44ef443ea6aa2ef73f9ab9b3ddd3901add11ab6fb0e5b6bdba222660e20c5922 Oct 06 09:04:55 crc 
kubenswrapper[4610]: I1006 09:04:55.088889 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667c9c995c-dfhd5"] Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.266316 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.356359 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-utilities\") pod \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.357019 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g62s\" (UniqueName: \"kubernetes.io/projected/4c65756c-04f9-4ee3-90fb-e57fcff2150b-kube-api-access-7g62s\") pod \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.357171 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-catalog-content\") pod \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\" (UID: \"4c65756c-04f9-4ee3-90fb-e57fcff2150b\") " Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.357233 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-utilities" (OuterVolumeSpecName: "utilities") pod "4c65756c-04f9-4ee3-90fb-e57fcff2150b" (UID: "4c65756c-04f9-4ee3-90fb-e57fcff2150b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.357983 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.360178 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c65756c-04f9-4ee3-90fb-e57fcff2150b-kube-api-access-7g62s" (OuterVolumeSpecName: "kube-api-access-7g62s") pod "4c65756c-04f9-4ee3-90fb-e57fcff2150b" (UID: "4c65756c-04f9-4ee3-90fb-e57fcff2150b"). InnerVolumeSpecName "kube-api-access-7g62s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.418002 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c65756c-04f9-4ee3-90fb-e57fcff2150b" (UID: "4c65756c-04f9-4ee3-90fb-e57fcff2150b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.459486 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g62s\" (UniqueName: \"kubernetes.io/projected/4c65756c-04f9-4ee3-90fb-e57fcff2150b-kube-api-access-7g62s\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.459520 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c65756c-04f9-4ee3-90fb-e57fcff2150b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.628311 4610 generic.go:334] "Generic (PLEG): container finished" podID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerID="82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c" exitCode=0 Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.628551 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d799" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.628604 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerDied","Data":"82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c"} Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.628632 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d799" event={"ID":"4c65756c-04f9-4ee3-90fb-e57fcff2150b","Type":"ContainerDied","Data":"1864650a4f4a20736888add8df8d9fdcc8ac648cd0e39a4b435835343d422f19"} Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.628651 4610 scope.go:117] "RemoveContainer" containerID="82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.636722 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-nm5v7" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.636766 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" event={"ID":"2228f60d-4cb6-43a2-9259-848d7353ad4b","Type":"ContainerStarted","Data":"9c42d5f00297d206866786d871ba77b362d98c9a7ce2bb9c99148946bb0c3feb"} Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.636831 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" event={"ID":"2228f60d-4cb6-43a2-9259-848d7353ad4b","Type":"ContainerStarted","Data":"44ef443ea6aa2ef73f9ab9b3ddd3901add11ab6fb0e5b6bdba222660e20c5922"} Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.756261 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-nm5v7"] Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.756334 4610 scope.go:117] "RemoveContainer" containerID="3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.764900 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-nm5v7"] Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.774205 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2d799"] Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.784140 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2d799"] Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.795186 4610 scope.go:117] "RemoveContainer" containerID="ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.817912 4610 scope.go:117] "RemoveContainer" containerID="82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c" Oct 06 09:04:55 crc kubenswrapper[4610]: E1006 09:04:55.818597 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c\": container with ID starting with 82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c not found: ID does not exist" containerID="82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.818645 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c"} err="failed to get container status \"82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c\": rpc error: code = NotFound desc = could not find container \"82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c\": container with ID starting with 82d1d38fe97a32a9eb1d896e18bb0e79927187a2dd62747150f1e58dcbd9377c not found: ID does not exist" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.818677 4610 scope.go:117] "RemoveContainer" containerID="3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753" Oct 06 09:04:55 crc kubenswrapper[4610]: E1006 09:04:55.819193 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753\": container with ID starting with 3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753 not found: ID does not exist" 
containerID="3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.819217 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753"} err="failed to get container status \"3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753\": rpc error: code = NotFound desc = could not find container \"3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753\": container with ID starting with 3d31aa40870fb4fa3b2c710f6a02bc54ae2518b617ee4e4d48e7350289785753 not found: ID does not exist" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.819231 4610 scope.go:117] "RemoveContainer" containerID="ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c" Oct 06 09:04:55 crc kubenswrapper[4610]: E1006 09:04:55.819506 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c\": container with ID starting with ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c not found: ID does not exist" containerID="ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c" Oct 06 09:04:55 crc kubenswrapper[4610]: I1006 09:04:55.819535 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c"} err="failed to get container status \"ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c\": rpc error: code = NotFound desc = could not find container \"ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c\": container with ID starting with ce12ed4ffdbdcec94d30c2af6cd28cb611a141f253086481699ec98609a97a5c not found: ID does not exist" Oct 06 09:04:56 crc kubenswrapper[4610]: I1006 09:04:56.646395 4610 generic.go:334] "Generic (PLEG): container finished" podID="2228f60d-4cb6-43a2-9259-848d7353ad4b" containerID="9c42d5f00297d206866786d871ba77b362d98c9a7ce2bb9c99148946bb0c3feb" exitCode=0 Oct 06 09:04:56 crc kubenswrapper[4610]: I1006 09:04:56.646452 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" event={"ID":"2228f60d-4cb6-43a2-9259-848d7353ad4b","Type":"ContainerDied","Data":"9c42d5f00297d206866786d871ba77b362d98c9a7ce2bb9c99148946bb0c3feb"} Oct 06 09:04:56 crc kubenswrapper[4610]: I1006 09:04:56.646766 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:04:56 crc kubenswrapper[4610]: I1006 09:04:56.646782 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" event={"ID":"2228f60d-4cb6-43a2-9259-848d7353ad4b","Type":"ContainerStarted","Data":"f65d9e8f42b22a36549a2002156fa0213395502ca9fb6b23a64f659e6e741687"} Oct 06 09:04:56 crc kubenswrapper[4610]: I1006 09:04:56.674999 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" podStartSLOduration=2.6749769370000003 podStartE2EDuration="2.674976937s" podCreationTimestamp="2025-10-06 09:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:04:56.667565941 +0000 UTC m=+1428.382619349" watchObservedRunningTime="2025-10-06 09:04:56.674976937 +0000 UTC m=+1428.390030345" Oct 06 
09:04:57 crc kubenswrapper[4610]: I1006 09:04:57.081300 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" path="/var/lib/kubelet/pods/4c65756c-04f9-4ee3-90fb-e57fcff2150b/volumes" Oct 06 09:04:57 crc kubenswrapper[4610]: I1006 09:04:57.082115 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" path="/var/lib/kubelet/pods/e898ec76-5bfb-4dd2-a007-d96f6a2e5d94/volumes" Oct 06 09:04:59 crc kubenswrapper[4610]: I1006 09:04:59.364600 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:59 crc kubenswrapper[4610]: I1006 09:04:59.432790 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:04:59 crc kubenswrapper[4610]: I1006 09:04:59.605200 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zzfqg"] Oct 06 09:05:00 crc kubenswrapper[4610]: I1006 09:05:00.692482 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zzfqg" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" containerID="cri-o://2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18" gracePeriod=2 Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.184432 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.264604 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-utilities\") pod \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.264660 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-catalog-content\") pod \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.264730 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmjq7\" (UniqueName: \"kubernetes.io/projected/76d8bd36-f193-4dec-9e26-d41537c5c6dc-kube-api-access-tmjq7\") pod \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\" (UID: \"76d8bd36-f193-4dec-9e26-d41537c5c6dc\") " Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.265866 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-utilities" (OuterVolumeSpecName: "utilities") pod "76d8bd36-f193-4dec-9e26-d41537c5c6dc" (UID: "76d8bd36-f193-4dec-9e26-d41537c5c6dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.270313 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d8bd36-f193-4dec-9e26-d41537c5c6dc-kube-api-access-tmjq7" (OuterVolumeSpecName: "kube-api-access-tmjq7") pod "76d8bd36-f193-4dec-9e26-d41537c5c6dc" (UID: "76d8bd36-f193-4dec-9e26-d41537c5c6dc"). InnerVolumeSpecName "kube-api-access-tmjq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.336083 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76d8bd36-f193-4dec-9e26-d41537c5c6dc" (UID: "76d8bd36-f193-4dec-9e26-d41537c5c6dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.367243 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.367439 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d8bd36-f193-4dec-9e26-d41537c5c6dc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.367528 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmjq7\" (UniqueName: \"kubernetes.io/projected/76d8bd36-f193-4dec-9e26-d41537c5c6dc-kube-api-access-tmjq7\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.704147 4610 generic.go:334] "Generic (PLEG): container finished" podID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerID="2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18" exitCode=0 Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.704211 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzfqg" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.704212 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerDied","Data":"2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18"} Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.705725 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzfqg" event={"ID":"76d8bd36-f193-4dec-9e26-d41537c5c6dc","Type":"ContainerDied","Data":"012fbdb9dd7b28d6149175dcc1901b55d488222ef04b23041ddb6070caa5d0b6"} Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.705746 4610 scope.go:117] "RemoveContainer" containerID="2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.740432 4610 scope.go:117] "RemoveContainer" containerID="f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.766153 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zzfqg"] Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.773908 4610 scope.go:117] "RemoveContainer" containerID="787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.774493 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zzfqg"] Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.813056 4610 scope.go:117] "RemoveContainer" containerID="2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18" Oct 06 09:05:01 crc kubenswrapper[4610]: E1006 09:05:01.813515 4610 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18\": container with ID starting with 2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18 not found: ID does not exist" containerID="2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.813551 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18"} err="failed to get container status \"2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18\": rpc error: code = NotFound desc = could not find container \"2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18\": container with ID starting with 2d05390d0ac55a51ea1f6556ad409e2586ccc837313b8c33e61faadaca4f4b18 not found: ID does not exist" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.813596 4610 scope.go:117] "RemoveContainer" containerID="f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256" Oct 06 09:05:01 crc kubenswrapper[4610]: E1006 09:05:01.814133 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256\": container with ID starting with f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256 not found: ID does not exist" containerID="f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.814237 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256"} err="failed to get container status \"f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256\": rpc error: code = NotFound desc = could not find container \"f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256\": container with ID starting with f896ff240c10f80841d3bfc140deb59f7f52688d6eb9c2373c0d92c5a60b0256 not found: ID does not exist" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.814333 4610 scope.go:117] "RemoveContainer" containerID="787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530" Oct 06 09:05:01 crc kubenswrapper[4610]: E1006 09:05:01.814799 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530\": container with ID starting with 787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530 not found: ID does not exist" containerID="787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530" Oct 06 09:05:01 crc kubenswrapper[4610]: I1006 09:05:01.814841 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530"} err="failed to get container status \"787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530\": rpc error: code = NotFound desc = could not find container \"787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530\": container with ID starting with 787b2f4e7236c465b007cbbe5ec9897cd044108291f745b73cf0d5c82e918530 not found: ID does not exist" Oct 06 09:05:03 crc kubenswrapper[4610]: I1006 09:05:03.084373 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" path="/var/lib/kubelet/pods/76d8bd36-f193-4dec-9e26-d41537c5c6dc/volumes" Oct 06 09:05:04 crc kubenswrapper[4610]: I1006 09:05:04.575255 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-667c9c995c-dfhd5" Oct 06 09:05:04 crc kubenswrapper[4610]: I1006 09:05:04.679733 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-r8l6z"] Oct 06 09:05:04 crc kubenswrapper[4610]: I1006 09:05:04.680013 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerName="dnsmasq-dns" containerID="cri-o://721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce" gracePeriod=10 Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.699459 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.745931 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-sb\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.747035 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-svc\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.747190 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-swift-storage-0\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.748123 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-openstack-edpm-ipam\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.748336 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbwj2\" (UniqueName: \"kubernetes.io/projected/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-kube-api-access-sbwj2\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.748439 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-config\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.748538 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-nb\") pod \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\" (UID: \"2c61f8a2-8bb2-4f82-a997-42d6218b0e76\") " Oct 06 09:05:05 crc kubenswrapper[4610]: 
I1006 09:05:05.776245 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-kube-api-access-sbwj2" (OuterVolumeSpecName: "kube-api-access-sbwj2") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "kube-api-access-sbwj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.783433 4610 generic.go:334] "Generic (PLEG): container finished" podID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerID="721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce" exitCode=0 Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.783473 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" event={"ID":"2c61f8a2-8bb2-4f82-a997-42d6218b0e76","Type":"ContainerDied","Data":"721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce"} Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.783500 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" event={"ID":"2c61f8a2-8bb2-4f82-a997-42d6218b0e76","Type":"ContainerDied","Data":"f2a135156e6fc688fa3e51cce14dca70f63073248aa9e4b753e8d5303c5cd742"} Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.783517 4610 scope.go:117] "RemoveContainer" containerID="721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.783646 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-r8l6z" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.842267 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-config" (OuterVolumeSpecName: "config") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.846207 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.850963 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbwj2\" (UniqueName: \"kubernetes.io/projected/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-kube-api-access-sbwj2\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.850990 4610 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-config\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.851000 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.879695 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.883897 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.896763 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.911620 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c61f8a2-8bb2-4f82-a997-42d6218b0e76" (UID: "2c61f8a2-8bb2-4f82-a997-42d6218b0e76"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.952175 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.952211 4610 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.952221 4610 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.952228 4610 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c61f8a2-8bb2-4f82-a997-42d6218b0e76-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:05 crc kubenswrapper[4610]: I1006 09:05:05.984365 4610 scope.go:117] "RemoveContainer" containerID="827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d" Oct 06 09:05:06 crc kubenswrapper[4610]: I1006 09:05:06.004948 4610 scope.go:117] "RemoveContainer" containerID="721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce" Oct 06 09:05:06 crc kubenswrapper[4610]: E1006 09:05:06.005407 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce\": container with ID starting with 721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce not found: ID does not exist" containerID="721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce" Oct 06 09:05:06 crc kubenswrapper[4610]: I1006 09:05:06.005479 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce"} err="failed to get container status \"721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce\": rpc error: code = NotFound desc = could not find container \"721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce\": container with ID starting with 721a5391d0596e0ae2ae90d235d6420f37fe2fbb1a697f18f89c2dbd4b1aedce not found: ID does not exist" Oct 06 09:05:06 crc kubenswrapper[4610]: I1006 09:05:06.005535 4610 scope.go:117] "RemoveContainer" containerID="827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d" Oct 06 09:05:06 crc kubenswrapper[4610]: E1006 09:05:06.005890 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d\": container with ID starting with 827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d not found: ID does not exist" containerID="827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d" Oct 06 09:05:06 crc kubenswrapper[4610]: I1006 09:05:06.005938 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d"} err="failed to get container status \"827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d\": rpc error: code = 
NotFound desc = could not find container \"827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d\": container with ID starting with 827765368ec17bf82da252c2d578c78f03a31c09846e8dc5d7715cf9687bce8d not found: ID does not exist" Oct 06 09:05:06 crc kubenswrapper[4610]: I1006 09:05:06.126930 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-r8l6z"] Oct 06 09:05:06 crc kubenswrapper[4610]: I1006 09:05:06.137154 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-r8l6z"] Oct 06 09:05:07 crc kubenswrapper[4610]: I1006 09:05:07.083569 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" path="/var/lib/kubelet/pods/2c61f8a2-8bb2-4f82-a997-42d6218b0e76/volumes" Oct 06 09:05:11 crc kubenswrapper[4610]: I1006 09:05:11.529453 4610 scope.go:117] "RemoveContainer" containerID="8600434c25fa68074174302fe7241a279af41c963bcecbf68d85006435562bd5" Oct 06 09:05:15 crc kubenswrapper[4610]: I1006 09:05:15.886193 4610 generic.go:334] "Generic (PLEG): container finished" podID="099c0f32-ad2c-4b69-a308-f46f3dbab2be" containerID="07b3d9fabae0e21ef224106dbcb4e4ca4042b18c4c9f4bcfae2034459e2b8726" exitCode=0 Oct 06 09:05:15 crc kubenswrapper[4610]: I1006 09:05:15.886304 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"099c0f32-ad2c-4b69-a308-f46f3dbab2be","Type":"ContainerDied","Data":"07b3d9fabae0e21ef224106dbcb4e4ca4042b18c4c9f4bcfae2034459e2b8726"} Oct 06 09:05:15 crc kubenswrapper[4610]: I1006 09:05:15.890312 4610 generic.go:334] "Generic (PLEG): container finished" podID="060bb971-d347-44c3-b9ce-6c06c13bcb51" containerID="b955ec40defc5de4c039348ac943d9c5c34d829221d1d667c0b99c1c5ac2dd7d" exitCode=0 Oct 06 09:05:15 crc kubenswrapper[4610]: I1006 09:05:15.890361 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"060bb971-d347-44c3-b9ce-6c06c13bcb51","Type":"ContainerDied","Data":"b955ec40defc5de4c039348ac943d9c5c34d829221d1d667c0b99c1c5ac2dd7d"} Oct 06 09:05:16 crc kubenswrapper[4610]: I1006 09:05:16.899792 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"099c0f32-ad2c-4b69-a308-f46f3dbab2be","Type":"ContainerStarted","Data":"9d5dd14a11cac5a13533ba2a8e7d66aa0fa32b09e92ee70f86bc7a5a1e68771e"} Oct 06 09:05:16 crc kubenswrapper[4610]: I1006 09:05:16.901300 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:05:16 crc kubenswrapper[4610]: I1006 09:05:16.903819 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"060bb971-d347-44c3-b9ce-6c06c13bcb51","Type":"ContainerStarted","Data":"78df201aabe7c6424de964848f42b869cf079c8dfbf2b2e5a6f5f25043e3d38b"} Oct 06 09:05:16 crc kubenswrapper[4610]: I1006 09:05:16.904301 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 09:05:16 crc kubenswrapper[4610]: I1006 09:05:16.932264 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.932244126 podStartE2EDuration="36.932244126s" podCreationTimestamp="2025-10-06 09:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:05:16.92632881 +0000 UTC m=+1448.641382218" 
watchObservedRunningTime="2025-10-06 09:05:16.932244126 +0000 UTC m=+1448.647297514" Oct 06 09:05:16 crc kubenswrapper[4610]: I1006 09:05:16.951851 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.951833804 podStartE2EDuration="36.951833804s" podCreationTimestamp="2025-10-06 09:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:05:16.945735853 +0000 UTC m=+1448.660789261" watchObservedRunningTime="2025-10-06 09:05:16.951833804 +0000 UTC m=+1448.666887192" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.521509 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2w8bn"] Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522559 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="extract-content" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522577 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="extract-content" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522596 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522604 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522621 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="dnsmasq-dns" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522630 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="dnsmasq-dns" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522656 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="registry-server" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522663 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="registry-server" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522671 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="extract-utilities" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522678 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="extract-utilities" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522690 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="extract-content" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522696 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="extract-content" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522715 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerName="init" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522723 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" 
containerName="init" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522738 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="init" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522746 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="init" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522759 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="extract-utilities" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522768 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="extract-utilities" Oct 06 09:05:23 crc kubenswrapper[4610]: E1006 09:05:23.522808 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerName="dnsmasq-dns" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.522816 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerName="dnsmasq-dns" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.523056 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e898ec76-5bfb-4dd2-a007-d96f6a2e5d94" containerName="dnsmasq-dns" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.523084 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c61f8a2-8bb2-4f82-a997-42d6218b0e76" containerName="dnsmasq-dns" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.523097 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d8bd36-f193-4dec-9e26-d41537c5c6dc" containerName="registry-server" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.523109 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c65756c-04f9-4ee3-90fb-e57fcff2150b" containerName="registry-server" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.524825 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.574753 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w8bn"] Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.660632 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-catalog-content\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.660714 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmb8k\" (UniqueName: \"kubernetes.io/projected/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-kube-api-access-bmb8k\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.660779 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-utilities\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.761967 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmb8k\" (UniqueName: \"kubernetes.io/projected/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-kube-api-access-bmb8k\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.762029 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-utilities\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.762252 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-catalog-content\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.762723 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-catalog-content\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.762717 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-utilities\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.787436 4610 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bmb8k\" (UniqueName: \"kubernetes.io/projected/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-kube-api-access-bmb8k\") pod \"redhat-marketplace-2w8bn\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:23 crc kubenswrapper[4610]: I1006 09:05:23.845455 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:24 crc kubenswrapper[4610]: I1006 09:05:24.412728 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w8bn"] Oct 06 09:05:25 crc kubenswrapper[4610]: I1006 09:05:25.022228 4610 generic.go:334] "Generic (PLEG): container finished" podID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerID="19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b" exitCode=0 Oct 06 09:05:25 crc kubenswrapper[4610]: I1006 09:05:25.022336 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerDied","Data":"19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b"} Oct 06 09:05:25 crc kubenswrapper[4610]: I1006 09:05:25.022483 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerStarted","Data":"88f7fa952483fddfa716ad7a2f41191a413e807c57a0e3c31e0570e7b294dd6b"} Oct 06 09:05:26 crc kubenswrapper[4610]: I1006 09:05:26.031764 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerStarted","Data":"4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812"} Oct 06 09:05:27 crc kubenswrapper[4610]: I1006 09:05:27.043251 4610 generic.go:334] "Generic (PLEG): container finished" podID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerID="4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812" exitCode=0 Oct 06 09:05:27 crc kubenswrapper[4610]: I1006 09:05:27.043292 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerDied","Data":"4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812"} Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.054616 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerStarted","Data":"76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e"} Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.111573 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2w8bn" podStartSLOduration=2.597975861 podStartE2EDuration="5.111555262s" podCreationTimestamp="2025-10-06 09:05:23 +0000 UTC" firstStartedPulling="2025-10-06 09:05:25.024984499 +0000 UTC m=+1456.740037887" lastFinishedPulling="2025-10-06 09:05:27.5385639 +0000 UTC m=+1459.253617288" observedRunningTime="2025-10-06 09:05:28.103232212 +0000 UTC m=+1459.818285630" watchObservedRunningTime="2025-10-06 09:05:28.111555262 +0000 UTC m=+1459.826608650" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.244338 4610 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9"] Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.251534 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.253935 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.256057 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.257335 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.265030 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.329999 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9"] Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.351529 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d67p\" (UniqueName: \"kubernetes.io/projected/f70ce47b-f642-41e9-8649-7dc466c07c27-kube-api-access-6d67p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.351700 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.351904 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.352126 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.453961 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.454073 4610 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.454115 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d67p\" (UniqueName: \"kubernetes.io/projected/f70ce47b-f642-41e9-8649-7dc466c07c27-kube-api-access-6d67p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.454141 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.459475 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.465655 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.471416 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.475826 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d67p\" (UniqueName: \"kubernetes.io/projected/f70ce47b-f642-41e9-8649-7dc466c07c27-kube-api-access-6d67p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:28 crc kubenswrapper[4610]: I1006 09:05:28.570293 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" Oct 06 09:05:29 crc kubenswrapper[4610]: I1006 09:05:29.534023 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9"] Oct 06 09:05:30 crc kubenswrapper[4610]: I1006 09:05:30.074536 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" event={"ID":"f70ce47b-f642-41e9-8649-7dc466c07c27","Type":"ContainerStarted","Data":"f98b9083747b6eb0bf2ef60d207927e0fe6e8d56f12d47a3d3fcec288fc1c373"} Oct 06 09:05:31 crc kubenswrapper[4610]: I1006 09:05:31.175236 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 09:05:31 crc kubenswrapper[4610]: I1006 09:05:31.181558 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 09:05:33 crc kubenswrapper[4610]: I1006 09:05:33.846342 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:33 crc kubenswrapper[4610]: I1006 09:05:33.846638 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:33 crc kubenswrapper[4610]: I1006 09:05:33.906285 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:34 crc kubenswrapper[4610]: I1006 09:05:34.175004 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:34 crc kubenswrapper[4610]: I1006 09:05:34.239076 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w8bn"] Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.146029 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2w8bn" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="registry-server" containerID="cri-o://76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e" gracePeriod=2 Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.655993 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.761868 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmb8k\" (UniqueName: \"kubernetes.io/projected/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-kube-api-access-bmb8k\") pod \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.761926 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-utilities\") pod \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.761948 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-catalog-content\") pod \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\" (UID: \"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda\") " Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.762513 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-utilities" (OuterVolumeSpecName: "utilities") pod "c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" (UID: "c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.773128 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" (UID: "c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.783351 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-kube-api-access-bmb8k" (OuterVolumeSpecName: "kube-api-access-bmb8k") pod "c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" (UID: "c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda"). InnerVolumeSpecName "kube-api-access-bmb8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.864277 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmb8k\" (UniqueName: \"kubernetes.io/projected/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-kube-api-access-bmb8k\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.864310 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:36 crc kubenswrapper[4610]: I1006 09:05:36.864321 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.158605 4610 generic.go:334] "Generic (PLEG): container finished" podID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerID="76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e" exitCode=0 Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.158646 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerDied","Data":"76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e"} Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.158674 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w8bn" event={"ID":"c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda","Type":"ContainerDied","Data":"88f7fa952483fddfa716ad7a2f41191a413e807c57a0e3c31e0570e7b294dd6b"} Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.158692 4610 scope.go:117] "RemoveContainer" containerID="76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e" Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.158810 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w8bn" Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.192490 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w8bn"] Oct 06 09:05:37 crc kubenswrapper[4610]: I1006 09:05:37.203506 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w8bn"] Oct 06 09:05:39 crc kubenswrapper[4610]: I1006 09:05:39.091319 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" path="/var/lib/kubelet/pods/c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda/volumes" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.249099 4610 scope.go:117] "RemoveContainer" containerID="4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.401655 4610 scope.go:117] "RemoveContainer" containerID="19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.758065 4610 scope.go:117] "RemoveContainer" containerID="76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e" Oct 06 09:05:43 crc kubenswrapper[4610]: E1006 09:05:43.759127 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e\": container with ID starting with 76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e not found: ID does not exist" containerID="76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.759160 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e"} err="failed to get container status \"76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e\": rpc error: code = NotFound desc = could not find container \"76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e\": container with ID starting with 76b0f72e854da18e31ecd97d31abbb8176c126689535eb02855a974dbb354d1e not found: ID does not exist" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.759185 4610 scope.go:117] "RemoveContainer" containerID="4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812" Oct 06 09:05:43 crc kubenswrapper[4610]: E1006 09:05:43.759662 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812\": container with ID starting with 4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812 not found: ID does not exist" containerID="4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.759692 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812"} err="failed to get container status \"4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812\": rpc error: code = NotFound desc = could not find container \"4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812\": container with ID starting with 4f541a785c36f8bc681d5f4873ad4b872d184b40160deed5e19afbcc69a54812 not found: ID does not exist" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 
09:05:43.759709 4610 scope.go:117] "RemoveContainer" containerID="19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b" Oct 06 09:05:43 crc kubenswrapper[4610]: E1006 09:05:43.760465 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b\": container with ID starting with 19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b not found: ID does not exist" containerID="19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b" Oct 06 09:05:43 crc kubenswrapper[4610]: I1006 09:05:43.760532 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b"} err="failed to get container status \"19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b\": rpc error: code = NotFound desc = could not find container \"19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b\": container with ID starting with 19c4b7c20c071abda42dc14139a78f65cfded54f685fe0a7ceaf15407a16481b not found: ID does not exist" Oct 06 09:05:44 crc kubenswrapper[4610]: I1006 09:05:44.289296 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" event={"ID":"f70ce47b-f642-41e9-8649-7dc466c07c27","Type":"ContainerStarted","Data":"0f95110ddfac567135006c58ce99819ecd367e0f32a0739940205b1bea6e260e"} Oct 06 09:05:44 crc kubenswrapper[4610]: I1006 09:05:44.311625 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" podStartSLOduration=1.8991636490000001 podStartE2EDuration="16.311608431s" podCreationTimestamp="2025-10-06 09:05:28 +0000 UTC" firstStartedPulling="2025-10-06 09:05:29.538826878 +0000 UTC m=+1461.253880276" lastFinishedPulling="2025-10-06 09:05:43.95127167 +0000 UTC m=+1475.666325058" observedRunningTime="2025-10-06 09:05:44.305933501 +0000 UTC m=+1476.020986889" watchObservedRunningTime="2025-10-06 09:05:44.311608431 +0000 UTC m=+1476.026661819" Oct 06 09:05:57 crc kubenswrapper[4610]: I1006 09:05:57.426372 4610 generic.go:334] "Generic (PLEG): container finished" podID="f70ce47b-f642-41e9-8649-7dc466c07c27" containerID="0f95110ddfac567135006c58ce99819ecd367e0f32a0739940205b1bea6e260e" exitCode=0 Oct 06 09:05:57 crc kubenswrapper[4610]: I1006 09:05:57.426569 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" event={"ID":"f70ce47b-f642-41e9-8649-7dc466c07c27","Type":"ContainerDied","Data":"0f95110ddfac567135006c58ce99819ecd367e0f32a0739940205b1bea6e260e"} Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.002315 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.101802 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-inventory\") pod \"f70ce47b-f642-41e9-8649-7dc466c07c27\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") "
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.101915 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-repo-setup-combined-ca-bundle\") pod \"f70ce47b-f642-41e9-8649-7dc466c07c27\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") "
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.101959 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d67p\" (UniqueName: \"kubernetes.io/projected/f70ce47b-f642-41e9-8649-7dc466c07c27-kube-api-access-6d67p\") pod \"f70ce47b-f642-41e9-8649-7dc466c07c27\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") "
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.102001 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-ssh-key\") pod \"f70ce47b-f642-41e9-8649-7dc466c07c27\" (UID: \"f70ce47b-f642-41e9-8649-7dc466c07c27\") "
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.111246 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70ce47b-f642-41e9-8649-7dc466c07c27-kube-api-access-6d67p" (OuterVolumeSpecName: "kube-api-access-6d67p") pod "f70ce47b-f642-41e9-8649-7dc466c07c27" (UID: "f70ce47b-f642-41e9-8649-7dc466c07c27"). InnerVolumeSpecName "kube-api-access-6d67p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.124412 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f70ce47b-f642-41e9-8649-7dc466c07c27" (UID: "f70ce47b-f642-41e9-8649-7dc466c07c27"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.176180 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f70ce47b-f642-41e9-8649-7dc466c07c27" (UID: "f70ce47b-f642-41e9-8649-7dc466c07c27"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.183611 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-inventory" (OuterVolumeSpecName: "inventory") pod "f70ce47b-f642-41e9-8649-7dc466c07c27" (UID: "f70ce47b-f642-41e9-8649-7dc466c07c27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.203483 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.203524 4610 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.203535 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d67p\" (UniqueName: \"kubernetes.io/projected/f70ce47b-f642-41e9-8649-7dc466c07c27-kube-api-access-6d67p\") on node \"crc\" DevicePath \"\""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.203544 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ce47b-f642-41e9-8649-7dc466c07c27-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.445408 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9" event={"ID":"f70ce47b-f642-41e9-8649-7dc466c07c27","Type":"ContainerDied","Data":"f98b9083747b6eb0bf2ef60d207927e0fe6e8d56f12d47a3d3fcec288fc1c373"}
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.445443 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f98b9083747b6eb0bf2ef60d207927e0fe6e8d56f12d47a3d3fcec288fc1c373"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.445467 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534031 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"]
Oct 06 09:05:59 crc kubenswrapper[4610]: E1006 09:05:59.534422 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="extract-content"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534433 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="extract-content"
Oct 06 09:05:59 crc kubenswrapper[4610]: E1006 09:05:59.534453 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="extract-utilities"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534459 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="extract-utilities"
Oct 06 09:05:59 crc kubenswrapper[4610]: E1006 09:05:59.534466 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70ce47b-f642-41e9-8649-7dc466c07c27" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534473 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70ce47b-f642-41e9-8649-7dc466c07c27" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:05:59 crc kubenswrapper[4610]: E1006 09:05:59.534492 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="registry-server"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534497 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="registry-server"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534653 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a1ef94-d1f4-4c26-b80c-77d37d8d2dda" containerName="registry-server"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.534676 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70ce47b-f642-41e9-8649-7dc466c07c27" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.535394 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.540209 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.540430 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.540573 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.540928 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.603952 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"]
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.608556 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxh5\" (UniqueName: \"kubernetes.io/projected/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-kube-api-access-wsxh5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.608649 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.608694 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.709910 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxh5\" (UniqueName: \"kubernetes.io/projected/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-kube-api-access-wsxh5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.710000 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.710087 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.714032 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.714230 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.732531 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxh5\" (UniqueName: \"kubernetes.io/projected/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-kube-api-access-wsxh5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2dstt\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:05:59 crc kubenswrapper[4610]: I1006 09:05:59.861712 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:06:00 crc kubenswrapper[4610]: I1006 09:06:00.373220 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"]
Oct 06 09:06:00 crc kubenswrapper[4610]: I1006 09:06:00.457422 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt" event={"ID":"4dd4792e-84d5-41ff-bc84-b3d0bde5377a","Type":"ContainerStarted","Data":"0a0e7b06602ea6c4f043ce442efa72a5ededea8c1964d0574e91211461d8147d"}
Oct 06 09:06:01 crc kubenswrapper[4610]: I1006 09:06:01.467189 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt" event={"ID":"4dd4792e-84d5-41ff-bc84-b3d0bde5377a","Type":"ContainerStarted","Data":"f3f792b45f27c67ff2426bb816fd2de6dc45084074a91010daac2714de72e3d5"}
Oct 06 09:06:01 crc kubenswrapper[4610]: I1006 09:06:01.492877 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt" podStartSLOduration=2.29171806 podStartE2EDuration="2.492855155s" podCreationTimestamp="2025-10-06 09:05:59 +0000 UTC" firstStartedPulling="2025-10-06 09:06:00.385700168 +0000 UTC m=+1492.100753566" lastFinishedPulling="2025-10-06 09:06:00.586837273 +0000 UTC m=+1492.301890661" observedRunningTime="2025-10-06 09:06:01.483634512 +0000 UTC m=+1493.198687910" watchObservedRunningTime="2025-10-06 09:06:01.492855155 +0000 UTC m=+1493.207908543"
Oct 06 09:06:03 crc kubenswrapper[4610]: I1006 09:06:03.487100 4610 generic.go:334] "Generic (PLEG): container finished" podID="4dd4792e-84d5-41ff-bc84-b3d0bde5377a" containerID="f3f792b45f27c67ff2426bb816fd2de6dc45084074a91010daac2714de72e3d5" exitCode=0
Oct 06 09:06:03 crc kubenswrapper[4610]: I1006 09:06:03.487167 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt" event={"ID":"4dd4792e-84d5-41ff-bc84-b3d0bde5377a","Type":"ContainerDied","Data":"f3f792b45f27c67ff2426bb816fd2de6dc45084074a91010daac2714de72e3d5"}
Oct 06 09:06:04 crc kubenswrapper[4610]: I1006 09:06:04.958916 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.124231 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-inventory\") pod \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") "
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.124320 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsxh5\" (UniqueName: \"kubernetes.io/projected/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-kube-api-access-wsxh5\") pod \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") "
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.124443 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-ssh-key\") pod \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\" (UID: \"4dd4792e-84d5-41ff-bc84-b3d0bde5377a\") "
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.130964 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-kube-api-access-wsxh5" (OuterVolumeSpecName: "kube-api-access-wsxh5") pod "4dd4792e-84d5-41ff-bc84-b3d0bde5377a" (UID: "4dd4792e-84d5-41ff-bc84-b3d0bde5377a"). InnerVolumeSpecName "kube-api-access-wsxh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.161805 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4dd4792e-84d5-41ff-bc84-b3d0bde5377a" (UID: "4dd4792e-84d5-41ff-bc84-b3d0bde5377a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.166001 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-inventory" (OuterVolumeSpecName: "inventory") pod "4dd4792e-84d5-41ff-bc84-b3d0bde5377a" (UID: "4dd4792e-84d5-41ff-bc84-b3d0bde5377a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.227159 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.227193 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.227211 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsxh5\" (UniqueName: \"kubernetes.io/projected/4dd4792e-84d5-41ff-bc84-b3d0bde5377a-kube-api-access-wsxh5\") on node \"crc\" DevicePath \"\""
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.510164 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt" event={"ID":"4dd4792e-84d5-41ff-bc84-b3d0bde5377a","Type":"ContainerDied","Data":"0a0e7b06602ea6c4f043ce442efa72a5ededea8c1964d0574e91211461d8147d"}
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.510201 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a0e7b06602ea6c4f043ce442efa72a5ededea8c1964d0574e91211461d8147d"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.510236 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2dstt"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.593553 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"]
Oct 06 09:06:05 crc kubenswrapper[4610]: E1006 09:06:05.594590 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd4792e-84d5-41ff-bc84-b3d0bde5377a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.594615 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd4792e-84d5-41ff-bc84-b3d0bde5377a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.594900 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd4792e-84d5-41ff-bc84-b3d0bde5377a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.596016 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.603582 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.603632 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.604348 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.606817 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.606969 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"]
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.737235 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.737618 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tjl\" (UniqueName: \"kubernetes.io/projected/290e102b-3121-4f44-b861-2b2e2e297f7b-kube-api-access-h2tjl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.737714 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.737779 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.840250 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tjl\" (UniqueName: \"kubernetes.io/projected/290e102b-3121-4f44-b861-2b2e2e297f7b-kube-api-access-h2tjl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.840445 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.841404 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.841716 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.844596 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.846824 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.848213 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.859257 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tjl\" (UniqueName: \"kubernetes.io/projected/290e102b-3121-4f44-b861-2b2e2e297f7b-kube-api-access-h2tjl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:05 crc kubenswrapper[4610]: I1006 09:06:05.920593 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:06:06 crc kubenswrapper[4610]: I1006 09:06:06.491257 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"]
Oct 06 09:06:06 crc kubenswrapper[4610]: I1006 09:06:06.519784 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv" event={"ID":"290e102b-3121-4f44-b861-2b2e2e297f7b","Type":"ContainerStarted","Data":"737afbe7717e45291d2841b39ab2f8725c0feb233c5cc6934facf40fe815cf72"}
Oct 06 09:06:07 crc kubenswrapper[4610]: I1006 09:06:07.532253 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv" event={"ID":"290e102b-3121-4f44-b861-2b2e2e297f7b","Type":"ContainerStarted","Data":"2a33934bdbd9c72ec9b9eed91c5675e967ed094c25d019374d579ff7b59cbb28"}
Oct 06 09:06:07 crc kubenswrapper[4610]: I1006 09:06:07.559119 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv" podStartSLOduration=1.942066911 podStartE2EDuration="2.559093776s" podCreationTimestamp="2025-10-06 09:06:05 +0000 UTC" firstStartedPulling="2025-10-06 09:06:06.504312043 +0000 UTC m=+1498.219365431" lastFinishedPulling="2025-10-06 09:06:07.121338908 +0000 UTC m=+1498.836392296" observedRunningTime="2025-10-06 09:06:07.551512076 +0000 UTC m=+1499.266565484" watchObservedRunningTime="2025-10-06 09:06:07.559093776 +0000 UTC m=+1499.274147164"
Oct 06 09:06:11 crc kubenswrapper[4610]: I1006 09:06:11.983075 4610 scope.go:117] "RemoveContainer" containerID="37f7457232ba5abeb6c1eda06e9027474218e6abea335caccd389835d4d822b7"
Oct 06 09:06:12 crc kubenswrapper[4610]: I1006 09:06:12.098407 4610 scope.go:117] "RemoveContainer" containerID="05cd332531da528c7c4b53c3febbebd690c03cf920eb818592abea5271832907"
Oct 06 09:06:12 crc kubenswrapper[4610]: I1006 09:06:12.179993 4610 scope.go:117] "RemoveContainer" containerID="49e2b9439c917bab9422d724bf8134ba7c14f441da31953fdf53987e82958c4e"
Oct 06 09:06:16 crc kubenswrapper[4610]: I1006 09:06:16.469165 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:06:16 crc kubenswrapper[4610]: I1006 09:06:16.469787 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:06:46 crc kubenswrapper[4610]: I1006 09:06:46.469492 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:06:46 crc kubenswrapper[4610]: I1006 09:06:46.470014 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:07:12 crc kubenswrapper[4610]: I1006 09:07:12.337677 4610 scope.go:117] "RemoveContainer" containerID="882160670d83072988d19d3d783db490deed5d5939efd6625ed8a513ef1389ba"
Oct 06 09:07:16 crc kubenswrapper[4610]: I1006 09:07:16.469134 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:07:16 crc kubenswrapper[4610]: I1006 09:07:16.469451 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:07:16 crc kubenswrapper[4610]: I1006 09:07:16.469522 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr"
Oct 06 09:07:16 crc kubenswrapper[4610]: I1006 09:07:16.470415 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 09:07:16 crc kubenswrapper[4610]: I1006 09:07:16.470488 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" gracePeriod=600
Oct 06 09:07:16 crc kubenswrapper[4610]: E1006 09:07:16.602386 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:07:17 crc kubenswrapper[4610]: I1006 09:07:17.273796 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" exitCode=0
Oct 06 09:07:17 crc kubenswrapper[4610]: I1006 09:07:17.273871 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"}
Oct 06 09:07:17 crc kubenswrapper[4610]: I1006 09:07:17.274138 4610 scope.go:117] "RemoveContainer" containerID="86993ec28f7e0d41d125c67a4926a12ed67d073648b7d992a6c8ef6e8c000659"
Oct 06 09:07:17 crc kubenswrapper[4610]: I1006 09:07:17.274737 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:07:17 crc kubenswrapper[4610]: E1006 09:07:17.275126 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.819019 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8gbm"]
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.821471 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.826734 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg58x\" (UniqueName: \"kubernetes.io/projected/22f8a556-00d7-4613-a374-9e4a116bf9d9-kube-api-access-tg58x\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.827068 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-utilities\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.827294 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-catalog-content\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.833186 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gbm"]
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.929950 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-utilities\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.930480 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-catalog-content\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.930560 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-utilities\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.930566 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg58x\" (UniqueName: \"kubernetes.io/projected/22f8a556-00d7-4613-a374-9e4a116bf9d9-kube-api-access-tg58x\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.931143 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-catalog-content\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:22 crc kubenswrapper[4610]: I1006 09:07:22.950070 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg58x\" (UniqueName: \"kubernetes.io/projected/22f8a556-00d7-4613-a374-9e4a116bf9d9-kube-api-access-tg58x\") pod \"community-operators-k8gbm\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") " pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:23 crc kubenswrapper[4610]: I1006 09:07:23.142534 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:23 crc kubenswrapper[4610]: I1006 09:07:23.659741 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gbm"]
Oct 06 09:07:24 crc kubenswrapper[4610]: I1006 09:07:24.351162 4610 generic.go:334] "Generic (PLEG): container finished" podID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerID="ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308" exitCode=0
Oct 06 09:07:24 crc kubenswrapper[4610]: I1006 09:07:24.351448 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerDied","Data":"ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308"}
Oct 06 09:07:24 crc kubenswrapper[4610]: I1006 09:07:24.351479 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerStarted","Data":"1d37c4bfa525cca81bc38aea23e9039c015697ad084f6ec16cb1d859b5e8de4a"}
Oct 06 09:07:26 crc kubenswrapper[4610]: I1006 09:07:26.370063 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerStarted","Data":"d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0"}
Oct 06 09:07:27 crc kubenswrapper[4610]: I1006 09:07:27.380141 4610 generic.go:334] "Generic (PLEG): container finished" podID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerID="d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0" exitCode=0
Oct 06 09:07:27 crc kubenswrapper[4610]: I1006 09:07:27.380199 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerDied","Data":"d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0"}
Oct 06 09:07:29 crc kubenswrapper[4610]: I1006 09:07:29.457076 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerStarted","Data":"7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae"}
Oct 06 09:07:29 crc kubenswrapper[4610]: I1006 09:07:29.480863 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8gbm" podStartSLOduration=3.606441708 podStartE2EDuration="7.480844048s" podCreationTimestamp="2025-10-06 09:07:22 +0000 UTC" firstStartedPulling="2025-10-06 09:07:24.354029774 +0000 UTC m=+1576.069083152" lastFinishedPulling="2025-10-06 09:07:28.228432104 +0000 UTC m=+1579.943485492" observedRunningTime="2025-10-06 09:07:29.477573262 +0000 UTC m=+1581.192626650" watchObservedRunningTime="2025-10-06 09:07:29.480844048 +0000 UTC m=+1581.195897436"
Oct 06 09:07:30 crc kubenswrapper[4610]: I1006 09:07:30.071244 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:07:30 crc kubenswrapper[4610]: E1006 09:07:30.071542 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:07:33 crc kubenswrapper[4610]: I1006 09:07:33.143175 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:33 crc kubenswrapper[4610]: I1006 09:07:33.144785 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:34 crc kubenswrapper[4610]: I1006 09:07:34.193253 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-k8gbm" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="registry-server" probeResult="failure" output=<
Oct 06 09:07:34 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s
Oct 06 09:07:34 crc kubenswrapper[4610]: >
Oct 06 09:07:42 crc kubenswrapper[4610]: I1006 09:07:42.071029 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:07:42 crc kubenswrapper[4610]: E1006 09:07:42.072193 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:07:43 crc kubenswrapper[4610]: I1006 09:07:43.219847 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:43 crc kubenswrapper[4610]: I1006 09:07:43.273068 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:43 crc kubenswrapper[4610]: I1006 09:07:43.456088 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gbm"]
Oct 06 09:07:44 crc kubenswrapper[4610]: I1006 09:07:44.647418 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8gbm" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="registry-server" containerID="cri-o://7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae" gracePeriod=2
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.127175 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.288439 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-catalog-content\") pod \"22f8a556-00d7-4613-a374-9e4a116bf9d9\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") "
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.290486 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-utilities\") pod \"22f8a556-00d7-4613-a374-9e4a116bf9d9\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") "
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.290542 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg58x\" (UniqueName: \"kubernetes.io/projected/22f8a556-00d7-4613-a374-9e4a116bf9d9-kube-api-access-tg58x\") pod \"22f8a556-00d7-4613-a374-9e4a116bf9d9\" (UID: \"22f8a556-00d7-4613-a374-9e4a116bf9d9\") "
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.291434 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-utilities" (OuterVolumeSpecName: "utilities") pod "22f8a556-00d7-4613-a374-9e4a116bf9d9" (UID: "22f8a556-00d7-4613-a374-9e4a116bf9d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.298754 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f8a556-00d7-4613-a374-9e4a116bf9d9-kube-api-access-tg58x" (OuterVolumeSpecName: "kube-api-access-tg58x") pod "22f8a556-00d7-4613-a374-9e4a116bf9d9" (UID: "22f8a556-00d7-4613-a374-9e4a116bf9d9"). InnerVolumeSpecName "kube-api-access-tg58x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.342247 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22f8a556-00d7-4613-a374-9e4a116bf9d9" (UID: "22f8a556-00d7-4613-a374-9e4a116bf9d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.392882 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.392919 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg58x\" (UniqueName: \"kubernetes.io/projected/22f8a556-00d7-4613-a374-9e4a116bf9d9-kube-api-access-tg58x\") on node \"crc\" DevicePath \"\""
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.392931 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f8a556-00d7-4613-a374-9e4a116bf9d9-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.659343 4610 generic.go:334] "Generic (PLEG): container finished" podID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerID="7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae" exitCode=0
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.659635 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerDied","Data":"7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae"}
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.660521 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gbm" event={"ID":"22f8a556-00d7-4613-a374-9e4a116bf9d9","Type":"ContainerDied","Data":"1d37c4bfa525cca81bc38aea23e9039c015697ad084f6ec16cb1d859b5e8de4a"}
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.659722 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gbm"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.660552 4610 scope.go:117] "RemoveContainer" containerID="7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.698570 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gbm"]
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.700103 4610 scope.go:117] "RemoveContainer" containerID="d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.708584 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8gbm"]
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.735394 4610 scope.go:117] "RemoveContainer" containerID="ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.767014 4610 scope.go:117] "RemoveContainer" containerID="7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae"
Oct 06 09:07:45 crc kubenswrapper[4610]: E1006 09:07:45.767736 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae\": container with ID starting with 7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae not found: ID does not exist" containerID="7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.767774 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae"} err="failed to get container status \"7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae\": rpc error: code = NotFound desc = could not find container \"7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae\": container with ID starting with 7e58a5fe3eca5b3916e415515ad353c9c4df45ebd99c28a3df0c57ab5c4741ae not found: ID does not exist"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.767801 4610 scope.go:117] "RemoveContainer" containerID="d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0"
Oct 06 09:07:45 crc kubenswrapper[4610]: E1006 09:07:45.768332 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0\": container with ID starting with d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0 not found: ID does not exist" containerID="d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.768366 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0"} err="failed to get container status \"d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0\": rpc error: code = NotFound desc = could not find container \"d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0\": container with ID starting with d420c845e089fafd961dcae808b9ed1f10778c4dd28a72ef1d314c4f99a85fc0 not found: ID does not exist"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.768390 4610 scope.go:117] "RemoveContainer" containerID="ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308"
Oct 06 09:07:45 crc kubenswrapper[4610]: E1006 09:07:45.768807 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308\": container with ID starting with ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308 not found: ID does not exist" containerID="ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308"
Oct 06 09:07:45 crc kubenswrapper[4610]: I1006 09:07:45.768841 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308"} err="failed to get container status \"ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308\": rpc error: code = NotFound desc = could not find container \"ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308\": container with ID starting with ffc4de0267b04fca115347eddb9f38586ab52b22657ce37695f77c5230911308 not found: ID does not exist"
Oct 06 09:07:47 crc kubenswrapper[4610]: I1006 09:07:47.084954 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" path="/var/lib/kubelet/pods/22f8a556-00d7-4613-a374-9e4a116bf9d9/volumes"
Oct 06 09:07:53 crc kubenswrapper[4610]: I1006 09:07:53.070956 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:07:53 crc kubenswrapper[4610]: E1006 09:07:53.072109 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:08:07 crc kubenswrapper[4610]: I1006 09:08:07.071010 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:08:07 crc kubenswrapper[4610]: E1006 09:08:07.071808 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:08:19 crc kubenswrapper[4610]: I1006 09:08:19.078813 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:08:19 crc kubenswrapper[4610]: E1006 09:08:19.081827 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:08:33 crc kubenswrapper[4610]: I1006 09:08:33.076167 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:08:33 crc kubenswrapper[4610]: E1006 09:08:33.077440 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:08:48 crc kubenswrapper[4610]: I1006 09:08:48.070532 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:08:48 crc kubenswrapper[4610]: E1006 09:08:48.071435 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:09:00 crc kubenswrapper[4610]: I1006 09:09:00.071177 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:09:00 crc kubenswrapper[4610]: E1006 09:09:00.073387 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:09:13 crc kubenswrapper[4610]: I1006 09:09:13.070433 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95"
Oct 06 09:09:13 crc kubenswrapper[4610]: E1006 09:09:13.073491 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:09:24 crc kubenswrapper[4610]: I1006 09:09:24.064403 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mnlxm"]
Oct 06 09:09:24 crc kubenswrapper[4610]: I1006 09:09:24.071838 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v9lkn"]
Oct 06 09:09:24 crc kubenswrapper[4610]: I1006 09:09:24.080282 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mnlxm"]
Oct 06 09:09:24 crc kubenswrapper[4610]: I1006 09:09:24.088623 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v9lkn"]
Oct 06 09:09:24 crc kubenswrapper[4610]: I1006 09:09:24.699387 4610 generic.go:334] "Generic (PLEG): container finished" podID="290e102b-3121-4f44-b861-2b2e2e297f7b" containerID="2a33934bdbd9c72ec9b9eed91c5675e967ed094c25d019374d579ff7b59cbb28" exitCode=0
Oct 06 09:09:24 crc kubenswrapper[4610]: I1006 09:09:24.699464 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv" event={"ID":"290e102b-3121-4f44-b861-2b2e2e297f7b","Type":"ContainerDied","Data":"2a33934bdbd9c72ec9b9eed91c5675e967ed094c25d019374d579ff7b59cbb28"}
Oct 06 09:09:25 crc kubenswrapper[4610]: I1006 09:09:25.082998 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e" path="/var/lib/kubelet/pods/6f3c89a8-3d62-4323-b2a0-a7ebaad5ba4e/volumes"
Oct 06 09:09:25 crc kubenswrapper[4610]: I1006 09:09:25.085479 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed19e15-b1d8-4a4b-84a5-7cf57afdb998" path="/var/lib/kubelet/pods/fed19e15-b1d8-4a4b-84a5-7cf57afdb998/volumes"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.114829 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.198712 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tjl\" (UniqueName: \"kubernetes.io/projected/290e102b-3121-4f44-b861-2b2e2e297f7b-kube-api-access-h2tjl\") pod \"290e102b-3121-4f44-b861-2b2e2e297f7b\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") "
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.198871 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-bootstrap-combined-ca-bundle\") pod \"290e102b-3121-4f44-b861-2b2e2e297f7b\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") "
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.198952 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-inventory\") pod \"290e102b-3121-4f44-b861-2b2e2e297f7b\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") "
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.199206 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-ssh-key\") pod \"290e102b-3121-4f44-b861-2b2e2e297f7b\" (UID: \"290e102b-3121-4f44-b861-2b2e2e297f7b\") "
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.206390 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "290e102b-3121-4f44-b861-2b2e2e297f7b" (UID: "290e102b-3121-4f44-b861-2b2e2e297f7b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.209279 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290e102b-3121-4f44-b861-2b2e2e297f7b-kube-api-access-h2tjl" (OuterVolumeSpecName: "kube-api-access-h2tjl") pod "290e102b-3121-4f44-b861-2b2e2e297f7b" (UID: "290e102b-3121-4f44-b861-2b2e2e297f7b"). InnerVolumeSpecName "kube-api-access-h2tjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.232365 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "290e102b-3121-4f44-b861-2b2e2e297f7b" (UID: "290e102b-3121-4f44-b861-2b2e2e297f7b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.232485 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-inventory" (OuterVolumeSpecName: "inventory") pod "290e102b-3121-4f44-b861-2b2e2e297f7b" (UID: "290e102b-3121-4f44-b861-2b2e2e297f7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.302580 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.302625 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tjl\" (UniqueName: \"kubernetes.io/projected/290e102b-3121-4f44-b861-2b2e2e297f7b-kube-api-access-h2tjl\") on node \"crc\" DevicePath \"\""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.302648 4610 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.302666 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/290e102b-3121-4f44-b861-2b2e2e297f7b-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.724295 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv" event={"ID":"290e102b-3121-4f44-b861-2b2e2e297f7b","Type":"ContainerDied","Data":"737afbe7717e45291d2841b39ab2f8725c0feb233c5cc6934facf40fe815cf72"}
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.724693 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="737afbe7717e45291d2841b39ab2f8725c0feb233c5cc6934facf40fe815cf72"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.724413 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.862276 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz"]
Oct 06 09:09:26 crc kubenswrapper[4610]: E1006 09:09:26.863267 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290e102b-3121-4f44-b861-2b2e2e297f7b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.863419 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="290e102b-3121-4f44-b861-2b2e2e297f7b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:09:26 crc kubenswrapper[4610]: E1006 09:09:26.863573 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="extract-content"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.863740 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="extract-content"
Oct 06 09:09:26 crc kubenswrapper[4610]: E1006 09:09:26.863885 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="registry-server"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.864013 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="registry-server"
Oct 06 09:09:26 crc kubenswrapper[4610]: E1006 09:09:26.864199 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="extract-utilities"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.864320 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="extract-utilities"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.864672 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="290e102b-3121-4f44-b861-2b2e2e297f7b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.864775 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f8a556-00d7-4613-a374-9e4a116bf9d9" containerName="registry-server"
Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.865531 4610 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.868233 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.869983 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.869994 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.870985 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:09:26 crc kubenswrapper[4610]: I1006 09:09:26.874717 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz"] Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.015200 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8466\" (UniqueName: \"kubernetes.io/projected/460008b6-6b5b-43a1-b806-01340e52e472-kube-api-access-n8466\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.015268 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.015303 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.117268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8466\" (UniqueName: \"kubernetes.io/projected/460008b6-6b5b-43a1-b806-01340e52e472-kube-api-access-n8466\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.118205 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.118264 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.124069 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.127735 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.140801 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8466\" (UniqueName: \"kubernetes.io/projected/460008b6-6b5b-43a1-b806-01340e52e472-kube-api-access-n8466\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.187999 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.711173 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz"] Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.718261 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:09:27 crc kubenswrapper[4610]: I1006 09:09:27.743106 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" event={"ID":"460008b6-6b5b-43a1-b806-01340e52e472","Type":"ContainerStarted","Data":"48096e8dcb09ddbdd399b0a1957466fc872bcaa88c8678eb339ee7f6402896d3"} Oct 06 09:09:28 crc kubenswrapper[4610]: I1006 09:09:28.070600 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:09:28 crc kubenswrapper[4610]: E1006 09:09:28.071148 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:09:28 crc kubenswrapper[4610]: I1006 09:09:28.753055 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" event={"ID":"460008b6-6b5b-43a1-b806-01340e52e472","Type":"ContainerStarted","Data":"ea8d97c5123732c5cd3cad90005c2a05681dc5d8790f41a99c9366ee4d834fd6"} Oct 06 09:09:28 crc kubenswrapper[4610]: I1006 09:09:28.772583 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" podStartSLOduration=2.566420448 podStartE2EDuration="2.772565668s" podCreationTimestamp="2025-10-06 09:09:26 +0000 UTC" firstStartedPulling="2025-10-06 09:09:27.717991149 +0000 UTC m=+1699.433044547" lastFinishedPulling="2025-10-06 09:09:27.924136379 +0000 UTC m=+1699.639189767" observedRunningTime="2025-10-06 09:09:28.769796375 +0000 UTC m=+1700.484849763" watchObservedRunningTime="2025-10-06 09:09:28.772565668 +0000 UTC m=+1700.487619056" Oct 06 09:09:31 crc kubenswrapper[4610]: I1006 09:09:31.026125 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jhdzb"] Oct 06 09:09:31 crc kubenswrapper[4610]: I1006 09:09:31.036877 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jhdzb"] Oct 06 09:09:31 crc kubenswrapper[4610]: I1006 09:09:31.080098 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e201fa9-8437-4672-9561-5eec037869f4" path="/var/lib/kubelet/pods/0e201fa9-8437-4672-9561-5eec037869f4/volumes" Oct 06 09:09:41 crc kubenswrapper[4610]: I1006 09:09:41.051890 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-09e4-account-create-dvf9l"] Oct 06 09:09:41 crc kubenswrapper[4610]: I1006 09:09:41.062477 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-09e4-account-create-dvf9l"] Oct 06 09:09:41 crc kubenswrapper[4610]: I1006 09:09:41.081887 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6ed3df-0417-488a-a4e2-26ab08498a9f" path="/var/lib/kubelet/pods/4f6ed3df-0417-488a-a4e2-26ab08498a9f/volumes" Oct 06 09:09:42 crc kubenswrapper[4610]: I1006 09:09:42.033267 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1c22-account-create-s6pwd"] Oct 06 09:09:42 crc kubenswrapper[4610]: I1006 09:09:42.046628 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-53ab-account-create-rwc72"] Oct 06 09:09:42 crc kubenswrapper[4610]: I1006 09:09:42.060232 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1c22-account-create-s6pwd"] Oct 06 09:09:42 crc kubenswrapper[4610]: I1006 09:09:42.069564 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-53ab-account-create-rwc72"] Oct 06 09:09:42 crc kubenswrapper[4610]: I1006 09:09:42.070332 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:09:42 crc kubenswrapper[4610]: E1006 09:09:42.070607 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:09:43 crc kubenswrapper[4610]: I1006 09:09:43.082825 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ca654c-0a66-443d-97d2-4788f5738c56" path="/var/lib/kubelet/pods/56ca654c-0a66-443d-97d2-4788f5738c56/volumes" Oct 06 09:09:43 crc kubenswrapper[4610]: I1006 09:09:43.085520 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb232da-0946-4113-9a7a-1aaea2706f8a" path="/var/lib/kubelet/pods/afb232da-0946-4113-9a7a-1aaea2706f8a/volumes" Oct 06 09:09:53 crc 
kubenswrapper[4610]: I1006 09:09:53.070667 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:09:53 crc kubenswrapper[4610]: E1006 09:09:53.071972 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:10:04 crc kubenswrapper[4610]: I1006 09:10:04.071141 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:10:04 crc kubenswrapper[4610]: E1006 09:10:04.071970 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:10:06 crc kubenswrapper[4610]: I1006 09:10:06.049339 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bx54l"] Oct 06 09:10:06 crc kubenswrapper[4610]: I1006 09:10:06.062621 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-56r7l"] Oct 06 09:10:06 crc kubenswrapper[4610]: I1006 09:10:06.070821 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jkgbb"] Oct 06 09:10:06 crc kubenswrapper[4610]: I1006 09:10:06.079118 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bx54l"] Oct 06 09:10:06 crc kubenswrapper[4610]: I1006 09:10:06.087117 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jkgbb"] Oct 06 09:10:06 crc kubenswrapper[4610]: I1006 09:10:06.098009 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-56r7l"] Oct 06 09:10:07 crc kubenswrapper[4610]: I1006 09:10:07.092466 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4039566a-c25d-4fad-b328-11b75d88c287" path="/var/lib/kubelet/pods/4039566a-c25d-4fad-b328-11b75d88c287/volumes" Oct 06 09:10:07 crc kubenswrapper[4610]: I1006 09:10:07.095859 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8040cf-2183-4b30-9a60-9b630ca829ea" path="/var/lib/kubelet/pods/8c8040cf-2183-4b30-9a60-9b630ca829ea/volumes" Oct 06 09:10:07 crc kubenswrapper[4610]: I1006 09:10:07.099021 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d404c7-55ef-4ce8-bb45-1a79923b8209" path="/var/lib/kubelet/pods/b9d404c7-55ef-4ce8-bb45-1a79923b8209/volumes" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.039359 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tqdxt"] Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.049345 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tqdxt"] Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.496842 4610 scope.go:117] "RemoveContainer" containerID="d0d0d99963c6860f1eb1c1dd0cdab4bfe9ee960295dd0ab2858629f9c5983a09" Oct 06 09:10:12 crc 
kubenswrapper[4610]: I1006 09:10:12.527346 4610 scope.go:117] "RemoveContainer" containerID="3198771bf3833d1503552fcf70333e19f97d0ba4391a828e86d22ad4df5a3d76" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.563605 4610 scope.go:117] "RemoveContainer" containerID="23d7f95edbc46b8ce7475b4ca3b2b2568bc9511ff9d1d2b80284a5f1515d2b96" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.625572 4610 scope.go:117] "RemoveContainer" containerID="6c873aab7d8e1dcbdc3e4aaf32ae629ceac137d8ab1f74a5604fe2a7d1eeb53b" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.706292 4610 scope.go:117] "RemoveContainer" containerID="6fd837b2ddd9e5f3595d8947e6c3bb3f7b2c576f04794118ccc1b4482ae2b424" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.745586 4610 scope.go:117] "RemoveContainer" containerID="4ffcc846c9a54e89ab72cc1bf3c89472d0835a2af9952e5fda2177a252f7ba3e" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.781640 4610 scope.go:117] "RemoveContainer" containerID="1ff080250b301a7b5da06d570b76228da02dbfcfacca82762233fe1364cac112" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.826652 4610 scope.go:117] "RemoveContainer" containerID="c20c5d6fc9be25993dde0e7cd30eeb92b71a0f467af479b17860c0cae2374f5a" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.890624 4610 scope.go:117] "RemoveContainer" containerID="5dfa7208bf947a680ecb804f43446466bbb58ce3891c43cb41255957abb288a0" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.922406 4610 scope.go:117] "RemoveContainer" containerID="aa4806e25b18e158c02f3180d6b58763ef94b014e53186e02713445e5d98a60a" Oct 06 09:10:12 crc kubenswrapper[4610]: I1006 09:10:12.962979 4610 scope.go:117] "RemoveContainer" containerID="e32daa257ea131795761e0658bb7dd9455ca5381cd00110a5200f7b175d9528f" Oct 06 09:10:13 crc kubenswrapper[4610]: I1006 09:10:13.036222 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-695gv"] Oct 06 09:10:13 crc kubenswrapper[4610]: I1006 09:10:13.058242 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-695gv"] Oct 06 09:10:13 crc kubenswrapper[4610]: I1006 09:10:13.082027 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84909105-862a-45a9-b78f-35406a385fa7" path="/var/lib/kubelet/pods/84909105-862a-45a9-b78f-35406a385fa7/volumes" Oct 06 09:10:13 crc kubenswrapper[4610]: I1006 09:10:13.084317 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae19c152-10a1-47dd-bb19-c00bf79b56c5" path="/var/lib/kubelet/pods/ae19c152-10a1-47dd-bb19-c00bf79b56c5/volumes" Oct 06 09:10:16 crc kubenswrapper[4610]: I1006 09:10:16.072366 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:10:16 crc kubenswrapper[4610]: E1006 09:10:16.073314 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.030921 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e32f-account-create-489xl"] Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.041369 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-ac63-account-create-4btw5"] Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.054348 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e32f-account-create-489xl"] Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.064376 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cdf4-account-create-hfhbl"] Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.088034 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1274a89-0f59-489d-a02f-d08cc9513c2d" path="/var/lib/kubelet/pods/b1274a89-0f59-489d-a02f-d08cc9513c2d/volumes" Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.091130 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ac63-account-create-4btw5"] Oct 06 09:10:21 crc kubenswrapper[4610]: I1006 09:10:21.091164 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cdf4-account-create-hfhbl"] Oct 06 09:10:23 crc kubenswrapper[4610]: I1006 09:10:23.097734 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45050b7d-e181-4561-91ae-4ba8897b9daf" path="/var/lib/kubelet/pods/45050b7d-e181-4561-91ae-4ba8897b9daf/volumes" Oct 06 09:10:23 crc kubenswrapper[4610]: I1006 09:10:23.099549 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a181b0d5-7688-4cde-be34-8b7108abf09b" path="/var/lib/kubelet/pods/a181b0d5-7688-4cde-be34-8b7108abf09b/volumes" Oct 06 09:10:29 crc kubenswrapper[4610]: I1006 09:10:29.077160 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:10:29 crc kubenswrapper[4610]: E1006 09:10:29.077828 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:10:43 crc kubenswrapper[4610]: I1006 09:10:43.070821 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:10:43 crc kubenswrapper[4610]: E1006 09:10:43.071761 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:10:56 crc kubenswrapper[4610]: I1006 09:10:56.071471 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:10:56 crc kubenswrapper[4610]: E1006 09:10:56.072314 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:11:06 crc kubenswrapper[4610]: I1006 
09:11:06.037257 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zx98b"] Oct 06 09:11:06 crc kubenswrapper[4610]: I1006 09:11:06.046760 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j2w82"] Oct 06 09:11:06 crc kubenswrapper[4610]: I1006 09:11:06.055584 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zx98b"] Oct 06 09:11:06 crc kubenswrapper[4610]: I1006 09:11:06.063339 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j2w82"] Oct 06 09:11:07 crc kubenswrapper[4610]: I1006 09:11:07.090081 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d65a6aa-72d2-4b32-b19f-b76c50c13bc8" path="/var/lib/kubelet/pods/3d65a6aa-72d2-4b32-b19f-b76c50c13bc8/volumes" Oct 06 09:11:07 crc kubenswrapper[4610]: I1006 09:11:07.091433 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ba2911-ba6a-40d2-b05e-011016c788c4" path="/var/lib/kubelet/pods/72ba2911-ba6a-40d2-b05e-011016c788c4/volumes" Oct 06 09:11:10 crc kubenswrapper[4610]: I1006 09:11:10.071033 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:11:10 crc kubenswrapper[4610]: E1006 09:11:10.071825 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.233337 4610 scope.go:117] "RemoveContainer" containerID="ca99402205ea54a0920134ae7e29d90ddbc983cc185e4df7dc02c32c47bee88d" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.282593 4610 scope.go:117] "RemoveContainer" containerID="7a89fd45c8d4064110c7d6cc28d42916714e4c5693e04f4e839f1fa6653ce909" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.314011 4610 scope.go:117] "RemoveContainer" containerID="8fc11d620c32dc677ba2489f9a9dec3111fa353c90450c26a863454a5386aada" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.359704 4610 scope.go:117] "RemoveContainer" containerID="cd989300cbe2b04dfc331e821ba340e62171907ab6709eecb44068e4ea383c98" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.393238 4610 scope.go:117] "RemoveContainer" containerID="5c48930a935627d2b67b8cc77df60f8cf3936c3af19624399f4d0c1c514d206d" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.439778 4610 scope.go:117] "RemoveContainer" containerID="ba1719a5f9c891597a29cdef91a60d65f74c69861626d3ef8a5507a01c7b27b0" Oct 06 09:11:13 crc kubenswrapper[4610]: I1006 09:11:13.494181 4610 scope.go:117] "RemoveContainer" containerID="7b9110bbd9f6c2daeaf12f8cde894c577ac67e6f1bab0874600c481ea5e6a446" Oct 06 09:11:14 crc kubenswrapper[4610]: I1006 09:11:14.766820 4610 generic.go:334] "Generic (PLEG): container finished" podID="460008b6-6b5b-43a1-b806-01340e52e472" containerID="ea8d97c5123732c5cd3cad90005c2a05681dc5d8790f41a99c9366ee4d834fd6" exitCode=0 Oct 06 09:11:14 crc kubenswrapper[4610]: I1006 09:11:14.766863 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" 
event={"ID":"460008b6-6b5b-43a1-b806-01340e52e472","Type":"ContainerDied","Data":"ea8d97c5123732c5cd3cad90005c2a05681dc5d8790f41a99c9366ee4d834fd6"} Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.215627 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.388371 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-inventory\") pod \"460008b6-6b5b-43a1-b806-01340e52e472\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.388633 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8466\" (UniqueName: \"kubernetes.io/projected/460008b6-6b5b-43a1-b806-01340e52e472-kube-api-access-n8466\") pod \"460008b6-6b5b-43a1-b806-01340e52e472\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.388663 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-ssh-key\") pod \"460008b6-6b5b-43a1-b806-01340e52e472\" (UID: \"460008b6-6b5b-43a1-b806-01340e52e472\") " Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.395271 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460008b6-6b5b-43a1-b806-01340e52e472-kube-api-access-n8466" (OuterVolumeSpecName: "kube-api-access-n8466") pod "460008b6-6b5b-43a1-b806-01340e52e472" (UID: "460008b6-6b5b-43a1-b806-01340e52e472"). InnerVolumeSpecName "kube-api-access-n8466". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.415177 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "460008b6-6b5b-43a1-b806-01340e52e472" (UID: "460008b6-6b5b-43a1-b806-01340e52e472"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.417188 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-inventory" (OuterVolumeSpecName: "inventory") pod "460008b6-6b5b-43a1-b806-01340e52e472" (UID: "460008b6-6b5b-43a1-b806-01340e52e472"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.490963 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8466\" (UniqueName: \"kubernetes.io/projected/460008b6-6b5b-43a1-b806-01340e52e472-kube-api-access-n8466\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.491012 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.491026 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460008b6-6b5b-43a1-b806-01340e52e472-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.789326 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" event={"ID":"460008b6-6b5b-43a1-b806-01340e52e472","Type":"ContainerDied","Data":"48096e8dcb09ddbdd399b0a1957466fc872bcaa88c8678eb339ee7f6402896d3"} Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.789641 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48096e8dcb09ddbdd399b0a1957466fc872bcaa88c8678eb339ee7f6402896d3" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.789707 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.938994 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6"] Oct 06 09:11:16 crc kubenswrapper[4610]: E1006 09:11:16.939481 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460008b6-6b5b-43a1-b806-01340e52e472" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.939506 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="460008b6-6b5b-43a1-b806-01340e52e472" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.939758 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="460008b6-6b5b-43a1-b806-01340e52e472" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.940542 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.943688 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.944493 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.947920 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.954769 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:11:16 crc kubenswrapper[4610]: I1006 09:11:16.964367 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6"] Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.101408 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgkh\" (UniqueName: \"kubernetes.io/projected/489d3203-794d-455a-b5b8-97f933b8db19-kube-api-access-hhgkh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.101591 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.101659 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.203118 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgkh\" (UniqueName: \"kubernetes.io/projected/489d3203-794d-455a-b5b8-97f933b8db19-kube-api-access-hhgkh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.203217 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.203268 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.207557 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.210902 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.223455 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgkh\" (UniqueName: \"kubernetes.io/projected/489d3203-794d-455a-b5b8-97f933b8db19-kube-api-access-hhgkh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.255760 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.785404 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6"] Oct 06 09:11:17 crc kubenswrapper[4610]: I1006 09:11:17.801243 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" event={"ID":"489d3203-794d-455a-b5b8-97f933b8db19","Type":"ContainerStarted","Data":"8d530e7ed3dc2a46fdd2c67c24ca2dd8c57a82299c2b4ce4e013957a5c5bb84d"} Oct 06 09:11:18 crc kubenswrapper[4610]: I1006 09:11:18.814169 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" event={"ID":"489d3203-794d-455a-b5b8-97f933b8db19","Type":"ContainerStarted","Data":"d2002af88941988a7539b004dd8a9139dee79c7f80654d87ac03b7b0a6622885"} Oct 06 09:11:18 crc kubenswrapper[4610]: I1006 09:11:18.838274 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" podStartSLOduration=2.646840389 podStartE2EDuration="2.83825361s" podCreationTimestamp="2025-10-06 09:11:16 +0000 UTC" firstStartedPulling="2025-10-06 09:11:17.78627772 +0000 UTC m=+1809.501331108" lastFinishedPulling="2025-10-06 09:11:17.977690941 +0000 UTC m=+1809.692744329" observedRunningTime="2025-10-06 09:11:18.835956099 +0000 UTC m=+1810.551009517" watchObservedRunningTime="2025-10-06 09:11:18.83825361 +0000 UTC m=+1810.553306998" Oct 06 09:11:19 crc kubenswrapper[4610]: I1006 09:11:19.033427 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-t9gmg"] Oct 06 09:11:19 crc kubenswrapper[4610]: I1006 09:11:19.044785 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-sync-t9gmg"] Oct 06 09:11:19 crc kubenswrapper[4610]: I1006 09:11:19.086225 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4408d93d-c733-4032-92fc-df3c6d8d9b0b" path="/var/lib/kubelet/pods/4408d93d-c733-4032-92fc-df3c6d8d9b0b/volumes" Oct 06 09:11:22 crc kubenswrapper[4610]: I1006 09:11:22.070604 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:11:22 crc kubenswrapper[4610]: E1006 09:11:22.071177 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:11:33 crc kubenswrapper[4610]: I1006 09:11:33.026798 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-29xd8"] Oct 06 09:11:33 crc kubenswrapper[4610]: I1006 09:11:33.036470 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-29xd8"] Oct 06 09:11:33 crc kubenswrapper[4610]: I1006 09:11:33.085096 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5" path="/var/lib/kubelet/pods/32dd0fa5-c2b0-48dc-a81f-bfa7c58ecda5/volumes" Oct 06 09:11:35 crc kubenswrapper[4610]: I1006 09:11:35.070815 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:11:35 crc kubenswrapper[4610]: E1006 09:11:35.071337 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:11:39 crc kubenswrapper[4610]: I1006 09:11:39.040248 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jflcj"] Oct 06 09:11:39 crc kubenswrapper[4610]: I1006 09:11:39.058309 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jflcj"] Oct 06 09:11:39 crc kubenswrapper[4610]: I1006 09:11:39.088026 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca74dbf-7969-4a03-a618-83505fc9c7ec" path="/var/lib/kubelet/pods/2ca74dbf-7969-4a03-a618-83505fc9c7ec/volumes" Oct 06 09:11:49 crc kubenswrapper[4610]: I1006 09:11:49.082763 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:11:49 crc kubenswrapper[4610]: E1006 09:11:49.083932 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:12:03 crc kubenswrapper[4610]: I1006 09:12:03.071077 4610 scope.go:117] "RemoveContainer" 
containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:12:03 crc kubenswrapper[4610]: E1006 09:12:03.071963 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:12:13 crc kubenswrapper[4610]: I1006 09:12:13.769169 4610 scope.go:117] "RemoveContainer" containerID="f187f0499e60ca9263f4ef445bc5b30abc0ed6aa9779c414a635f7445ffc04f6" Oct 06 09:12:13 crc kubenswrapper[4610]: I1006 09:12:13.802909 4610 scope.go:117] "RemoveContainer" containerID="8bfc650ceeae3f526c02589de3817799b86fc030d345e9699069dc45c8242cc3" Oct 06 09:12:13 crc kubenswrapper[4610]: I1006 09:12:13.860335 4610 scope.go:117] "RemoveContainer" containerID="e286f948c1b89a9b3231a2a8a2d32f066aeffb1cfdc87f67ab136f30ff81987b" Oct 06 09:12:14 crc kubenswrapper[4610]: I1006 09:12:14.045023 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-dd97s"] Oct 06 09:12:14 crc kubenswrapper[4610]: I1006 09:12:14.053997 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qrd6k"] Oct 06 09:12:14 crc kubenswrapper[4610]: I1006 09:12:14.064892 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-dd97s"] Oct 06 09:12:14 crc kubenswrapper[4610]: I1006 09:12:14.075942 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qrd6k"] Oct 06 09:12:14 crc kubenswrapper[4610]: I1006 09:12:14.083245 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-q6xqb"] Oct 06 09:12:14 crc kubenswrapper[4610]: I1006 09:12:14.091595 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-q6xqb"] Oct 06 09:12:15 crc kubenswrapper[4610]: I1006 09:12:15.081115 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf5de1e-fa0f-47d2-a549-35836fecffa8" path="/var/lib/kubelet/pods/1bf5de1e-fa0f-47d2-a549-35836fecffa8/volumes" Oct 06 09:12:15 crc kubenswrapper[4610]: I1006 09:12:15.082057 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69" path="/var/lib/kubelet/pods/9cd27a1b-1f42-4cb0-9d18-8fb1405e6a69/volumes" Oct 06 09:12:15 crc kubenswrapper[4610]: I1006 09:12:15.082567 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55" path="/var/lib/kubelet/pods/bcf183c7-2aaa-4ac6-9aae-ce3f12f19e55/volumes" Oct 06 09:12:18 crc kubenswrapper[4610]: I1006 09:12:18.070532 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:12:18 crc kubenswrapper[4610]: I1006 09:12:18.363165 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"a7252b6aaf2929a0234f9cea134ab6667be1abfa6b1537e1c78a905ebc421a87"} Oct 06 09:12:23 crc kubenswrapper[4610]: I1006 09:12:23.038341 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8b9a-account-create-jt5xg"] Oct 06 09:12:23 crc kubenswrapper[4610]: 
I1006 09:12:23.049152 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8b9a-account-create-jt5xg"] Oct 06 09:12:23 crc kubenswrapper[4610]: I1006 09:12:23.083785 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a39c855-fc17-4cc4-af66-dc39f28fc009" path="/var/lib/kubelet/pods/4a39c855-fc17-4cc4-af66-dc39f28fc009/volumes" Oct 06 09:12:24 crc kubenswrapper[4610]: I1006 09:12:24.025702 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9537-account-create-4p6r5"] Oct 06 09:12:24 crc kubenswrapper[4610]: I1006 09:12:24.036430 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9537-account-create-4p6r5"] Oct 06 09:12:25 crc kubenswrapper[4610]: I1006 09:12:25.026498 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-844b-account-create-n8hvs"] Oct 06 09:12:25 crc kubenswrapper[4610]: I1006 09:12:25.035295 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-844b-account-create-n8hvs"] Oct 06 09:12:25 crc kubenswrapper[4610]: I1006 09:12:25.082917 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37db7d0c-8148-47dd-b730-b471fd07f6be" path="/var/lib/kubelet/pods/37db7d0c-8148-47dd-b730-b471fd07f6be/volumes" Oct 06 09:12:25 crc kubenswrapper[4610]: I1006 09:12:25.083833 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77db1d8-8747-4703-9eb7-80037220ecde" path="/var/lib/kubelet/pods/b77db1d8-8747-4703-9eb7-80037220ecde/volumes" Oct 06 09:12:33 crc kubenswrapper[4610]: I1006 09:12:33.472603 4610 generic.go:334] "Generic (PLEG): container finished" podID="489d3203-794d-455a-b5b8-97f933b8db19" containerID="d2002af88941988a7539b004dd8a9139dee79c7f80654d87ac03b7b0a6622885" exitCode=0 Oct 06 09:12:33 crc kubenswrapper[4610]: I1006 09:12:33.472683 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" event={"ID":"489d3203-794d-455a-b5b8-97f933b8db19","Type":"ContainerDied","Data":"d2002af88941988a7539b004dd8a9139dee79c7f80654d87ac03b7b0a6622885"} Oct 06 09:12:34 crc kubenswrapper[4610]: I1006 09:12:34.950428 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.137845 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-inventory\") pod \"489d3203-794d-455a-b5b8-97f933b8db19\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.138323 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-ssh-key\") pod \"489d3203-794d-455a-b5b8-97f933b8db19\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.138518 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhgkh\" (UniqueName: \"kubernetes.io/projected/489d3203-794d-455a-b5b8-97f933b8db19-kube-api-access-hhgkh\") pod \"489d3203-794d-455a-b5b8-97f933b8db19\" (UID: \"489d3203-794d-455a-b5b8-97f933b8db19\") " Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.146131 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489d3203-794d-455a-b5b8-97f933b8db19-kube-api-access-hhgkh" (OuterVolumeSpecName: "kube-api-access-hhgkh") pod "489d3203-794d-455a-b5b8-97f933b8db19" (UID: "489d3203-794d-455a-b5b8-97f933b8db19"). InnerVolumeSpecName "kube-api-access-hhgkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.197881 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "489d3203-794d-455a-b5b8-97f933b8db19" (UID: "489d3203-794d-455a-b5b8-97f933b8db19"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.203562 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-inventory" (OuterVolumeSpecName: "inventory") pod "489d3203-794d-455a-b5b8-97f933b8db19" (UID: "489d3203-794d-455a-b5b8-97f933b8db19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.243484 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhgkh\" (UniqueName: \"kubernetes.io/projected/489d3203-794d-455a-b5b8-97f933b8db19-kube-api-access-hhgkh\") on node \"crc\" DevicePath \"\"" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.243525 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.243537 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/489d3203-794d-455a-b5b8-97f933b8db19-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.490703 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" event={"ID":"489d3203-794d-455a-b5b8-97f933b8db19","Type":"ContainerDied","Data":"8d530e7ed3dc2a46fdd2c67c24ca2dd8c57a82299c2b4ce4e013957a5c5bb84d"} Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.490741 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d530e7ed3dc2a46fdd2c67c24ca2dd8c57a82299c2b4ce4e013957a5c5bb84d" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.490771 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.591203 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5"] Oct 06 09:12:35 crc kubenswrapper[4610]: E1006 09:12:35.591661 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489d3203-794d-455a-b5b8-97f933b8db19" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.591682 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="489d3203-794d-455a-b5b8-97f933b8db19" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.591899 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="489d3203-794d-455a-b5b8-97f933b8db19" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.592831 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.595583 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.595916 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.595944 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.595950 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.606156 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5"] Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.651359 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.651467 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.651522 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbps\" (UniqueName: \"kubernetes.io/projected/d37ed6ae-3ad3-4604-9149-4e2b8006375e-kube-api-access-fzbps\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.752737 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.753503 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.753664 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbps\" (UniqueName: \"kubernetes.io/projected/d37ed6ae-3ad3-4604-9149-4e2b8006375e-kube-api-access-fzbps\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.756274 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.756285 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.772303 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbps\" (UniqueName: \"kubernetes.io/projected/d37ed6ae-3ad3-4604-9149-4e2b8006375e-kube-api-access-fzbps\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:35 crc kubenswrapper[4610]: I1006 09:12:35.926991 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:36 crc kubenswrapper[4610]: I1006 09:12:36.423911 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5"] Oct 06 09:12:36 crc kubenswrapper[4610]: I1006 09:12:36.499742 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" event={"ID":"d37ed6ae-3ad3-4604-9149-4e2b8006375e","Type":"ContainerStarted","Data":"dd77ad3f6445f697b2b3a1afc0bceeb047da38b90d7639e0a1cfa1caf5e1044e"} Oct 06 09:12:37 crc kubenswrapper[4610]: I1006 09:12:37.510437 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" event={"ID":"d37ed6ae-3ad3-4604-9149-4e2b8006375e","Type":"ContainerStarted","Data":"e1ff8864aee9f3bf55fa6378bed00661e1138c7adb2478fc73ccafd3860da4f9"} Oct 06 09:12:37 crc kubenswrapper[4610]: I1006 09:12:37.532814 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" podStartSLOduration=2.3626546299999998 podStartE2EDuration="2.53279956s" podCreationTimestamp="2025-10-06 09:12:35 +0000 UTC" firstStartedPulling="2025-10-06 09:12:36.441195074 +0000 UTC m=+1888.156248472" lastFinishedPulling="2025-10-06 09:12:36.611340014 +0000 UTC m=+1888.326393402" observedRunningTime="2025-10-06 09:12:37.526585686 +0000 UTC m=+1889.241639064" watchObservedRunningTime="2025-10-06 09:12:37.53279956 +0000 UTC m=+1889.247852938" Oct 06 09:12:42 crc kubenswrapper[4610]: I1006 09:12:42.573390 4610 generic.go:334] "Generic (PLEG): container finished" podID="d37ed6ae-3ad3-4604-9149-4e2b8006375e" containerID="e1ff8864aee9f3bf55fa6378bed00661e1138c7adb2478fc73ccafd3860da4f9" exitCode=0 Oct 06 09:12:42 crc kubenswrapper[4610]: 
I1006 09:12:42.573467 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" event={"ID":"d37ed6ae-3ad3-4604-9149-4e2b8006375e","Type":"ContainerDied","Data":"e1ff8864aee9f3bf55fa6378bed00661e1138c7adb2478fc73ccafd3860da4f9"} Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:43.996534 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.024316 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbps\" (UniqueName: \"kubernetes.io/projected/d37ed6ae-3ad3-4604-9149-4e2b8006375e-kube-api-access-fzbps\") pod \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.025675 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-inventory\") pod \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.025719 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-ssh-key\") pod \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\" (UID: \"d37ed6ae-3ad3-4604-9149-4e2b8006375e\") " Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.059025 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37ed6ae-3ad3-4604-9149-4e2b8006375e-kube-api-access-fzbps" (OuterVolumeSpecName: "kube-api-access-fzbps") pod "d37ed6ae-3ad3-4604-9149-4e2b8006375e" (UID: "d37ed6ae-3ad3-4604-9149-4e2b8006375e"). InnerVolumeSpecName "kube-api-access-fzbps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.067553 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-inventory" (OuterVolumeSpecName: "inventory") pod "d37ed6ae-3ad3-4604-9149-4e2b8006375e" (UID: "d37ed6ae-3ad3-4604-9149-4e2b8006375e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.071607 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d37ed6ae-3ad3-4604-9149-4e2b8006375e" (UID: "d37ed6ae-3ad3-4604-9149-4e2b8006375e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.129236 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbps\" (UniqueName: \"kubernetes.io/projected/d37ed6ae-3ad3-4604-9149-4e2b8006375e-kube-api-access-fzbps\") on node \"crc\" DevicePath \"\"" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.129296 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.129305 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d37ed6ae-3ad3-4604-9149-4e2b8006375e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.598316 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" event={"ID":"d37ed6ae-3ad3-4604-9149-4e2b8006375e","Type":"ContainerDied","Data":"dd77ad3f6445f697b2b3a1afc0bceeb047da38b90d7639e0a1cfa1caf5e1044e"} Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.598861 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd77ad3f6445f697b2b3a1afc0bceeb047da38b90d7639e0a1cfa1caf5e1044e" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.598383 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.683991 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf"] Oct 06 09:12:44 crc kubenswrapper[4610]: E1006 09:12:44.684358 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37ed6ae-3ad3-4604-9149-4e2b8006375e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.684371 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37ed6ae-3ad3-4604-9149-4e2b8006375e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.684581 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37ed6ae-3ad3-4604-9149-4e2b8006375e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.685161 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.687851 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.688839 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.689119 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.689220 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.725395 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf"] Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.743485 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nckgs\" (UniqueName: \"kubernetes.io/projected/b1e06674-8934-4170-9b16-5bf7292977ff-kube-api-access-nckgs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.743616 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.743701 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.846412 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nckgs\" (UniqueName: \"kubernetes.io/projected/b1e06674-8934-4170-9b16-5bf7292977ff-kube-api-access-nckgs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.846962 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.848111 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: 
\"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.852793 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.853988 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:44 crc kubenswrapper[4610]: I1006 09:12:44.862169 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nckgs\" (UniqueName: \"kubernetes.io/projected/b1e06674-8934-4170-9b16-5bf7292977ff-kube-api-access-nckgs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rcwcf\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:45 crc kubenswrapper[4610]: I1006 09:12:45.011804 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:12:45 crc kubenswrapper[4610]: I1006 09:12:45.658145 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf"] Oct 06 09:12:46 crc kubenswrapper[4610]: I1006 09:12:46.620395 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" event={"ID":"b1e06674-8934-4170-9b16-5bf7292977ff","Type":"ContainerStarted","Data":"b50c0b82734ca1ffdcf915caa15307b5ca9e03bd7c5e7e0ade907c04fa546b66"} Oct 06 09:12:46 crc kubenswrapper[4610]: I1006 09:12:46.622682 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" event={"ID":"b1e06674-8934-4170-9b16-5bf7292977ff","Type":"ContainerStarted","Data":"031949bcfc6dda91a1451137aafa03d1d3778ab94307899a147057d9be4dd2f9"} Oct 06 09:12:46 crc kubenswrapper[4610]: I1006 09:12:46.651926 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" podStartSLOduration=2.477040973 podStartE2EDuration="2.651895567s" podCreationTimestamp="2025-10-06 09:12:44 +0000 UTC" firstStartedPulling="2025-10-06 09:12:45.675170924 +0000 UTC m=+1897.390224312" lastFinishedPulling="2025-10-06 09:12:45.850025518 +0000 UTC m=+1897.565078906" observedRunningTime="2025-10-06 09:12:46.640144227 +0000 UTC m=+1898.355197625" watchObservedRunningTime="2025-10-06 09:12:46.651895567 +0000 UTC m=+1898.366948995" Oct 06 09:12:51 crc kubenswrapper[4610]: I1006 09:12:51.036475 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8mxh7"] Oct 06 09:12:51 crc kubenswrapper[4610]: I1006 09:12:51.042566 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8mxh7"] Oct 06 09:12:51 crc kubenswrapper[4610]: I1006 09:12:51.081347 4610 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="16262603-bfa5-485b-bcbc-c61b3390f964" path="/var/lib/kubelet/pods/16262603-bfa5-485b-bcbc-c61b3390f964/volumes" Oct 06 09:13:11 crc kubenswrapper[4610]: I1006 09:13:11.054088 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-m9pmg"] Oct 06 09:13:11 crc kubenswrapper[4610]: I1006 09:13:11.063309 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-m9pmg"] Oct 06 09:13:11 crc kubenswrapper[4610]: I1006 09:13:11.081680 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7411bbf-edb9-450b-ae3b-02ccaa0dd04a" path="/var/lib/kubelet/pods/d7411bbf-edb9-450b-ae3b-02ccaa0dd04a/volumes" Oct 06 09:13:13 crc kubenswrapper[4610]: I1006 09:13:13.039539 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vflrs"] Oct 06 09:13:13 crc kubenswrapper[4610]: I1006 09:13:13.054500 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vflrs"] Oct 06 09:13:13 crc kubenswrapper[4610]: I1006 09:13:13.086356 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ed6294-5577-470e-8571-199cc7cc777d" path="/var/lib/kubelet/pods/f6ed6294-5577-470e-8571-199cc7cc777d/volumes" Oct 06 09:13:13 crc kubenswrapper[4610]: I1006 09:13:13.977881 4610 scope.go:117] "RemoveContainer" containerID="ca5fdc62c020cdfbae7862d83298cb2627b8031bbecc60a7d4942edf1bab1b9a" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.043357 4610 scope.go:117] "RemoveContainer" containerID="ffc63b5c1a5a389db05fa90cde7d066e8b58f29bd02a274b951ca374e62b233b" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.116385 4610 scope.go:117] "RemoveContainer" containerID="3f323c5f23ca573f4bf0fe819e8c7276efae1d8ad46ba6ec6fecde023a07678c" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.175414 4610 scope.go:117] "RemoveContainer" containerID="23893b0ed63a7f063f18ed0d010084ae8a9b47a2b0ce1805d5896a565049824e" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.198572 4610 scope.go:117] "RemoveContainer" containerID="97ae70e2be6928e0392c0c63ed74a6b0bdaabeed2465d4ad6f18deaf921c2765" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.243812 4610 scope.go:117] "RemoveContainer" containerID="5c39e5162ee2d6e80f8972919d2e53019c1bed87cb41ad4215c3375c93e54f48" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.290451 4610 scope.go:117] "RemoveContainer" containerID="05f9f58f875be83673aae401d57ca88b0deefa5852be2171880ae839549e78d1" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.335250 4610 scope.go:117] "RemoveContainer" containerID="4c9235706a689e81928754fc27d5d22c21347038c035b4371958dd2d2d7b1be4" Oct 06 09:13:14 crc kubenswrapper[4610]: I1006 09:13:14.356268 4610 scope.go:117] "RemoveContainer" containerID="275591c38c0a503746f7e21e1b9c9aedbb794ecff6c31ad9ee165a624f5ac971" Oct 06 09:13:31 crc kubenswrapper[4610]: I1006 09:13:31.075541 4610 generic.go:334] "Generic (PLEG): container finished" podID="b1e06674-8934-4170-9b16-5bf7292977ff" containerID="b50c0b82734ca1ffdcf915caa15307b5ca9e03bd7c5e7e0ade907c04fa546b66" exitCode=0 Oct 06 09:13:31 crc kubenswrapper[4610]: I1006 09:13:31.083173 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" event={"ID":"b1e06674-8934-4170-9b16-5bf7292977ff","Type":"ContainerDied","Data":"b50c0b82734ca1ffdcf915caa15307b5ca9e03bd7c5e7e0ade907c04fa546b66"} Oct 06 09:13:32 crc 
kubenswrapper[4610]: I1006 09:13:32.715648 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.827686 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nckgs\" (UniqueName: \"kubernetes.io/projected/b1e06674-8934-4170-9b16-5bf7292977ff-kube-api-access-nckgs\") pod \"b1e06674-8934-4170-9b16-5bf7292977ff\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.828242 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-ssh-key\") pod \"b1e06674-8934-4170-9b16-5bf7292977ff\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.828434 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-inventory\") pod \"b1e06674-8934-4170-9b16-5bf7292977ff\" (UID: \"b1e06674-8934-4170-9b16-5bf7292977ff\") " Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.837585 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e06674-8934-4170-9b16-5bf7292977ff-kube-api-access-nckgs" (OuterVolumeSpecName: "kube-api-access-nckgs") pod "b1e06674-8934-4170-9b16-5bf7292977ff" (UID: "b1e06674-8934-4170-9b16-5bf7292977ff"). InnerVolumeSpecName "kube-api-access-nckgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.880873 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-inventory" (OuterVolumeSpecName: "inventory") pod "b1e06674-8934-4170-9b16-5bf7292977ff" (UID: "b1e06674-8934-4170-9b16-5bf7292977ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.890753 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b1e06674-8934-4170-9b16-5bf7292977ff" (UID: "b1e06674-8934-4170-9b16-5bf7292977ff"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.931459 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nckgs\" (UniqueName: \"kubernetes.io/projected/b1e06674-8934-4170-9b16-5bf7292977ff-kube-api-access-nckgs\") on node \"crc\" DevicePath \"\"" Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.931510 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:13:32 crc kubenswrapper[4610]: I1006 09:13:32.931535 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1e06674-8934-4170-9b16-5bf7292977ff-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.096367 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" event={"ID":"b1e06674-8934-4170-9b16-5bf7292977ff","Type":"ContainerDied","Data":"031949bcfc6dda91a1451137aafa03d1d3778ab94307899a147057d9be4dd2f9"} Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.096434 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031949bcfc6dda91a1451137aafa03d1d3778ab94307899a147057d9be4dd2f9" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.096812 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rcwcf" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.345151 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6"] Oct 06 09:13:33 crc kubenswrapper[4610]: E1006 09:13:33.345683 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e06674-8934-4170-9b16-5bf7292977ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.345706 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e06674-8934-4170-9b16-5bf7292977ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.345945 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e06674-8934-4170-9b16-5bf7292977ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.346726 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.349119 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.349355 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.349516 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.351523 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.355886 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6"] Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.440387 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.440642 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.440881 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndw7b\" (UniqueName: \"kubernetes.io/projected/e529d0f1-3da5-4178-8720-5769624f4490-kube-api-access-ndw7b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.542165 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndw7b\" (UniqueName: \"kubernetes.io/projected/e529d0f1-3da5-4178-8720-5769624f4490-kube-api-access-ndw7b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.542260 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.542340 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" 
(UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.548681 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.552672 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.568390 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndw7b\" (UniqueName: \"kubernetes.io/projected/e529d0f1-3da5-4178-8720-5769624f4490-kube-api-access-ndw7b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:33 crc kubenswrapper[4610]: I1006 09:13:33.661437 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:13:34 crc kubenswrapper[4610]: I1006 09:13:34.243836 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6"] Oct 06 09:13:35 crc kubenswrapper[4610]: I1006 09:13:35.117155 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" event={"ID":"e529d0f1-3da5-4178-8720-5769624f4490","Type":"ContainerStarted","Data":"51c133e4faad8dbe4d62d9f9b7dae406ebb69337ef537e71479102adcd3d9987"} Oct 06 09:13:35 crc kubenswrapper[4610]: I1006 09:13:35.117560 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" event={"ID":"e529d0f1-3da5-4178-8720-5769624f4490","Type":"ContainerStarted","Data":"79be6583b0e6c55684c7e5fc03cd683ee01fd5bd9b5ed3bf401ee41bd6248523"} Oct 06 09:13:35 crc kubenswrapper[4610]: I1006 09:13:35.141331 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" podStartSLOduration=1.934943864 podStartE2EDuration="2.141306657s" podCreationTimestamp="2025-10-06 09:13:33 +0000 UTC" firstStartedPulling="2025-10-06 09:13:34.275715799 +0000 UTC m=+1945.990769187" lastFinishedPulling="2025-10-06 09:13:34.482078592 +0000 UTC m=+1946.197131980" observedRunningTime="2025-10-06 09:13:35.137799194 +0000 UTC m=+1946.852852582" watchObservedRunningTime="2025-10-06 09:13:35.141306657 +0000 UTC m=+1946.856360085" Oct 06 09:13:54 crc kubenswrapper[4610]: I1006 09:13:54.038477 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vvmt"] Oct 06 09:13:54 crc kubenswrapper[4610]: I1006 09:13:54.050853 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8vvmt"] Oct 06 09:13:55 crc kubenswrapper[4610]: I1006 09:13:55.081140 4610 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517bbf7b-880c-4564-b328-92d0bbf01003" path="/var/lib/kubelet/pods/517bbf7b-880c-4564-b328-92d0bbf01003/volumes" Oct 06 09:14:14 crc kubenswrapper[4610]: I1006 09:14:14.535117 4610 scope.go:117] "RemoveContainer" containerID="9e3198f305697609d3b465adfe366344753ac76f62aa4825f338c570a980de61" Oct 06 09:14:35 crc kubenswrapper[4610]: I1006 09:14:35.875169 4610 generic.go:334] "Generic (PLEG): container finished" podID="e529d0f1-3da5-4178-8720-5769624f4490" containerID="51c133e4faad8dbe4d62d9f9b7dae406ebb69337ef537e71479102adcd3d9987" exitCode=2 Oct 06 09:14:35 crc kubenswrapper[4610]: I1006 09:14:35.875274 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" event={"ID":"e529d0f1-3da5-4178-8720-5769624f4490","Type":"ContainerDied","Data":"51c133e4faad8dbe4d62d9f9b7dae406ebb69337ef537e71479102adcd3d9987"} Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.410373 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.537854 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-inventory\") pod \"e529d0f1-3da5-4178-8720-5769624f4490\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.537923 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-ssh-key\") pod \"e529d0f1-3da5-4178-8720-5769624f4490\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.537977 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndw7b\" (UniqueName: \"kubernetes.io/projected/e529d0f1-3da5-4178-8720-5769624f4490-kube-api-access-ndw7b\") pod \"e529d0f1-3da5-4178-8720-5769624f4490\" (UID: \"e529d0f1-3da5-4178-8720-5769624f4490\") " Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.544260 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e529d0f1-3da5-4178-8720-5769624f4490-kube-api-access-ndw7b" (OuterVolumeSpecName: "kube-api-access-ndw7b") pod "e529d0f1-3da5-4178-8720-5769624f4490" (UID: "e529d0f1-3da5-4178-8720-5769624f4490"). InnerVolumeSpecName "kube-api-access-ndw7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.570855 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e529d0f1-3da5-4178-8720-5769624f4490" (UID: "e529d0f1-3da5-4178-8720-5769624f4490"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.575525 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-inventory" (OuterVolumeSpecName: "inventory") pod "e529d0f1-3da5-4178-8720-5769624f4490" (UID: "e529d0f1-3da5-4178-8720-5769624f4490"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.640759 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.640802 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e529d0f1-3da5-4178-8720-5769624f4490-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.640817 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndw7b\" (UniqueName: \"kubernetes.io/projected/e529d0f1-3da5-4178-8720-5769624f4490-kube-api-access-ndw7b\") on node \"crc\" DevicePath \"\"" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.899871 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" event={"ID":"e529d0f1-3da5-4178-8720-5769624f4490","Type":"ContainerDied","Data":"79be6583b0e6c55684c7e5fc03cd683ee01fd5bd9b5ed3bf401ee41bd6248523"} Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.899928 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79be6583b0e6c55684c7e5fc03cd683ee01fd5bd9b5ed3bf401ee41bd6248523" Oct 06 09:14:37 crc kubenswrapper[4610]: I1006 09:14:37.900158 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.055156 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd"] Oct 06 09:14:45 crc kubenswrapper[4610]: E1006 09:14:45.056438 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e529d0f1-3da5-4178-8720-5769624f4490" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.056462 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e529d0f1-3da5-4178-8720-5769624f4490" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.056856 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e529d0f1-3da5-4178-8720-5769624f4490" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.058191 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.062527 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.063025 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.063268 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.063324 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd"] Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.064305 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.204503 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.204759 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7g7d\" (UniqueName: \"kubernetes.io/projected/b87a175d-5d06-4825-981c-ed2cf97fb652-kube-api-access-t7g7d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.205010 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.306966 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.307066 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7g7d\" (UniqueName: \"kubernetes.io/projected/b87a175d-5d06-4825-981c-ed2cf97fb652-kube-api-access-t7g7d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.307145 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" 
(UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.319927 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.321054 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.323825 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7g7d\" (UniqueName: \"kubernetes.io/projected/b87a175d-5d06-4825-981c-ed2cf97fb652-kube-api-access-t7g7d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-687wd\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.387700 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.617085 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5vfzs"] Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.622531 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.631564 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vfzs"] Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.713288 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-catalog-content\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.713347 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-utilities\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.713559 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sdq\" (UniqueName: \"kubernetes.io/projected/7b1a8891-4621-4216-b50b-63af59fd908a-kube-api-access-j4sdq\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.815818 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-catalog-content\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.816161 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-utilities\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.816240 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sdq\" (UniqueName: \"kubernetes.io/projected/7b1a8891-4621-4216-b50b-63af59fd908a-kube-api-access-j4sdq\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.816247 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-catalog-content\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.816583 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-utilities\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.852351 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j4sdq\" (UniqueName: \"kubernetes.io/projected/7b1a8891-4621-4216-b50b-63af59fd908a-kube-api-access-j4sdq\") pod \"redhat-operators-5vfzs\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.907500 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd"] Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.911584 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.953444 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:45 crc kubenswrapper[4610]: I1006 09:14:45.984815 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" event={"ID":"b87a175d-5d06-4825-981c-ed2cf97fb652","Type":"ContainerStarted","Data":"82d550e035a9f8dc0326c65b2529a5b87cc0d1c4323dd71f21354768c64bc198"} Oct 06 09:14:46 crc kubenswrapper[4610]: I1006 09:14:46.451245 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vfzs"] Oct 06 09:14:46 crc kubenswrapper[4610]: I1006 09:14:46.469481 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:14:46 crc kubenswrapper[4610]: I1006 09:14:46.469525 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:14:47 crc kubenswrapper[4610]: I1006 09:14:47.003628 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" event={"ID":"b87a175d-5d06-4825-981c-ed2cf97fb652","Type":"ContainerStarted","Data":"8565817acccfbb8869b72d6868df94767462ff692a63af0ed9ebf0b1a9b7e8b9"} Oct 06 09:14:47 crc kubenswrapper[4610]: I1006 09:14:47.007233 4610 generic.go:334] "Generic (PLEG): container finished" podID="7b1a8891-4621-4216-b50b-63af59fd908a" containerID="168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb" exitCode=0 Oct 06 09:14:47 crc kubenswrapper[4610]: I1006 09:14:47.007262 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerDied","Data":"168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb"} Oct 06 09:14:47 crc kubenswrapper[4610]: I1006 09:14:47.007277 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerStarted","Data":"85dd7d27e5946ff3f14b7b5a147a243b08f4b6505ca7cbce43f4e95034498df1"} Oct 06 09:14:47 crc kubenswrapper[4610]: I1006 09:14:47.039456 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" podStartSLOduration=1.889344906 
podStartE2EDuration="2.039424211s" podCreationTimestamp="2025-10-06 09:14:45 +0000 UTC" firstStartedPulling="2025-10-06 09:14:45.911381707 +0000 UTC m=+2017.626435095" lastFinishedPulling="2025-10-06 09:14:46.061461012 +0000 UTC m=+2017.776514400" observedRunningTime="2025-10-06 09:14:47.035411314 +0000 UTC m=+2018.750464772" watchObservedRunningTime="2025-10-06 09:14:47.039424211 +0000 UTC m=+2018.754477629" Oct 06 09:14:49 crc kubenswrapper[4610]: I1006 09:14:49.023654 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerStarted","Data":"5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841"} Oct 06 09:14:52 crc kubenswrapper[4610]: I1006 09:14:52.076384 4610 generic.go:334] "Generic (PLEG): container finished" podID="7b1a8891-4621-4216-b50b-63af59fd908a" containerID="5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841" exitCode=0 Oct 06 09:14:52 crc kubenswrapper[4610]: I1006 09:14:52.076446 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerDied","Data":"5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841"} Oct 06 09:14:53 crc kubenswrapper[4610]: I1006 09:14:53.089804 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerStarted","Data":"bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801"} Oct 06 09:14:53 crc kubenswrapper[4610]: I1006 09:14:53.125807 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5vfzs" podStartSLOduration=2.664677052 podStartE2EDuration="8.125787141s" podCreationTimestamp="2025-10-06 09:14:45 +0000 UTC" firstStartedPulling="2025-10-06 09:14:47.011813676 +0000 UTC m=+2018.726867114" lastFinishedPulling="2025-10-06 09:14:52.472923815 +0000 UTC m=+2024.187977203" observedRunningTime="2025-10-06 09:14:53.115676192 +0000 UTC m=+2024.830729590" watchObservedRunningTime="2025-10-06 09:14:53.125787141 +0000 UTC m=+2024.840840529" Oct 06 09:14:55 crc kubenswrapper[4610]: I1006 09:14:55.954839 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:55 crc kubenswrapper[4610]: I1006 09:14:55.955520 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:14:57 crc kubenswrapper[4610]: I1006 09:14:57.013120 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5vfzs" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="registry-server" probeResult="failure" output=< Oct 06 09:14:57 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:14:57 crc kubenswrapper[4610]: > Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.144141 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6"] Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.146505 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.153794 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.154096 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.195726 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6"] Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.217575 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b83115b-6506-4f12-8b07-a850a423dc9b-secret-volume\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.217886 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b83115b-6506-4f12-8b07-a850a423dc9b-config-volume\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.217946 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtldx\" (UniqueName: \"kubernetes.io/projected/6b83115b-6506-4f12-8b07-a850a423dc9b-kube-api-access-vtldx\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.319690 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b83115b-6506-4f12-8b07-a850a423dc9b-config-volume\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.319754 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtldx\" (UniqueName: \"kubernetes.io/projected/6b83115b-6506-4f12-8b07-a850a423dc9b-kube-api-access-vtldx\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.319911 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b83115b-6506-4f12-8b07-a850a423dc9b-secret-volume\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.320627 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b83115b-6506-4f12-8b07-a850a423dc9b-config-volume\") pod 
\"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.325213 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b83115b-6506-4f12-8b07-a850a423dc9b-secret-volume\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.336555 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtldx\" (UniqueName: \"kubernetes.io/projected/6b83115b-6506-4f12-8b07-a850a423dc9b-kube-api-access-vtldx\") pod \"collect-profiles-29329035-l7sc6\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.470649 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:00 crc kubenswrapper[4610]: I1006 09:15:00.951201 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6"] Oct 06 09:15:00 crc kubenswrapper[4610]: W1006 09:15:00.961718 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b83115b_6506_4f12_8b07_a850a423dc9b.slice/crio-353232149ee8c7a8143f66016c3353c37656abeb7d5ff1adb2a5b519ae01da62 WatchSource:0}: Error finding container 353232149ee8c7a8143f66016c3353c37656abeb7d5ff1adb2a5b519ae01da62: Status 404 returned error can't find the container with id 353232149ee8c7a8143f66016c3353c37656abeb7d5ff1adb2a5b519ae01da62 Oct 06 09:15:01 crc kubenswrapper[4610]: I1006 09:15:01.192394 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" event={"ID":"6b83115b-6506-4f12-8b07-a850a423dc9b","Type":"ContainerStarted","Data":"4c2990495f0d3f8012ae321831baa99006b91b0817fbcb21760344ae7c9cdfbe"} Oct 06 09:15:01 crc kubenswrapper[4610]: I1006 09:15:01.193098 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" event={"ID":"6b83115b-6506-4f12-8b07-a850a423dc9b","Type":"ContainerStarted","Data":"353232149ee8c7a8143f66016c3353c37656abeb7d5ff1adb2a5b519ae01da62"} Oct 06 09:15:02 crc kubenswrapper[4610]: I1006 09:15:02.202671 4610 generic.go:334] "Generic (PLEG): container finished" podID="6b83115b-6506-4f12-8b07-a850a423dc9b" containerID="4c2990495f0d3f8012ae321831baa99006b91b0817fbcb21760344ae7c9cdfbe" exitCode=0 Oct 06 09:15:02 crc kubenswrapper[4610]: I1006 09:15:02.202722 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" event={"ID":"6b83115b-6506-4f12-8b07-a850a423dc9b","Type":"ContainerDied","Data":"4c2990495f0d3f8012ae321831baa99006b91b0817fbcb21760344ae7c9cdfbe"} Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.609683 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.682222 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b83115b-6506-4f12-8b07-a850a423dc9b-secret-volume\") pod \"6b83115b-6506-4f12-8b07-a850a423dc9b\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.682535 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtldx\" (UniqueName: \"kubernetes.io/projected/6b83115b-6506-4f12-8b07-a850a423dc9b-kube-api-access-vtldx\") pod \"6b83115b-6506-4f12-8b07-a850a423dc9b\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.682603 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b83115b-6506-4f12-8b07-a850a423dc9b-config-volume\") pod \"6b83115b-6506-4f12-8b07-a850a423dc9b\" (UID: \"6b83115b-6506-4f12-8b07-a850a423dc9b\") " Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.683372 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b83115b-6506-4f12-8b07-a850a423dc9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b83115b-6506-4f12-8b07-a850a423dc9b" (UID: "6b83115b-6506-4f12-8b07-a850a423dc9b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.694789 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b83115b-6506-4f12-8b07-a850a423dc9b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b83115b-6506-4f12-8b07-a850a423dc9b" (UID: "6b83115b-6506-4f12-8b07-a850a423dc9b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.695257 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b83115b-6506-4f12-8b07-a850a423dc9b-kube-api-access-vtldx" (OuterVolumeSpecName: "kube-api-access-vtldx") pod "6b83115b-6506-4f12-8b07-a850a423dc9b" (UID: "6b83115b-6506-4f12-8b07-a850a423dc9b"). InnerVolumeSpecName "kube-api-access-vtldx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.784707 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b83115b-6506-4f12-8b07-a850a423dc9b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.784752 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtldx\" (UniqueName: \"kubernetes.io/projected/6b83115b-6506-4f12-8b07-a850a423dc9b-kube-api-access-vtldx\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:03 crc kubenswrapper[4610]: I1006 09:15:03.784764 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b83115b-6506-4f12-8b07-a850a423dc9b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:04 crc kubenswrapper[4610]: I1006 09:15:04.220946 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" event={"ID":"6b83115b-6506-4f12-8b07-a850a423dc9b","Type":"ContainerDied","Data":"353232149ee8c7a8143f66016c3353c37656abeb7d5ff1adb2a5b519ae01da62"} Oct 06 09:15:04 crc kubenswrapper[4610]: I1006 09:15:04.221019 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353232149ee8c7a8143f66016c3353c37656abeb7d5ff1adb2a5b519ae01da62" Oct 06 09:15:04 crc kubenswrapper[4610]: I1006 09:15:04.220979 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6" Oct 06 09:15:04 crc kubenswrapper[4610]: I1006 09:15:04.284441 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59"] Oct 06 09:15:04 crc kubenswrapper[4610]: I1006 09:15:04.290976 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-22k59"] Oct 06 09:15:05 crc kubenswrapper[4610]: I1006 09:15:05.082026 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f2291f3-fb1c-4d23-9f78-59ef302b5c02" path="/var/lib/kubelet/pods/6f2291f3-fb1c-4d23-9f78-59ef302b5c02/volumes" Oct 06 09:15:07 crc kubenswrapper[4610]: I1006 09:15:07.005731 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5vfzs" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="registry-server" probeResult="failure" output=< Oct 06 09:15:07 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:15:07 crc kubenswrapper[4610]: > Oct 06 09:15:14 crc kubenswrapper[4610]: I1006 09:15:14.667192 4610 scope.go:117] "RemoveContainer" containerID="dda44c10233a7de7bbe4e18440e175fafc2a7afda4e162068e939d1e77031fa0" Oct 06 09:15:16 crc kubenswrapper[4610]: I1006 09:15:16.007245 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:15:16 crc kubenswrapper[4610]: I1006 09:15:16.079512 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:15:16 crc kubenswrapper[4610]: I1006 09:15:16.469515 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:15:16 crc kubenswrapper[4610]: I1006 09:15:16.469584 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:15:16 crc kubenswrapper[4610]: I1006 09:15:16.823002 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vfzs"] Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.350709 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5vfzs" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="registry-server" containerID="cri-o://bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801" gracePeriod=2 Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.784819 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.893711 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-catalog-content\") pod \"7b1a8891-4621-4216-b50b-63af59fd908a\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.893862 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-utilities\") pod \"7b1a8891-4621-4216-b50b-63af59fd908a\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.893970 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4sdq\" (UniqueName: \"kubernetes.io/projected/7b1a8891-4621-4216-b50b-63af59fd908a-kube-api-access-j4sdq\") pod \"7b1a8891-4621-4216-b50b-63af59fd908a\" (UID: \"7b1a8891-4621-4216-b50b-63af59fd908a\") " Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.894568 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-utilities" (OuterVolumeSpecName: "utilities") pod "7b1a8891-4621-4216-b50b-63af59fd908a" (UID: "7b1a8891-4621-4216-b50b-63af59fd908a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.900544 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1a8891-4621-4216-b50b-63af59fd908a-kube-api-access-j4sdq" (OuterVolumeSpecName: "kube-api-access-j4sdq") pod "7b1a8891-4621-4216-b50b-63af59fd908a" (UID: "7b1a8891-4621-4216-b50b-63af59fd908a"). InnerVolumeSpecName "kube-api-access-j4sdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.960254 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b1a8891-4621-4216-b50b-63af59fd908a" (UID: "7b1a8891-4621-4216-b50b-63af59fd908a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.995864 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4sdq\" (UniqueName: \"kubernetes.io/projected/7b1a8891-4621-4216-b50b-63af59fd908a-kube-api-access-j4sdq\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.995899 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:17 crc kubenswrapper[4610]: I1006 09:15:17.995908 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1a8891-4621-4216-b50b-63af59fd908a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.363196 4610 generic.go:334] "Generic (PLEG): container finished" podID="7b1a8891-4621-4216-b50b-63af59fd908a" containerID="bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801" exitCode=0 Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.363222 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vfzs" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.363243 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerDied","Data":"bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801"} Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.364796 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vfzs" event={"ID":"7b1a8891-4621-4216-b50b-63af59fd908a","Type":"ContainerDied","Data":"85dd7d27e5946ff3f14b7b5a147a243b08f4b6505ca7cbce43f4e95034498df1"} Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.364818 4610 scope.go:117] "RemoveContainer" containerID="bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.393158 4610 scope.go:117] "RemoveContainer" containerID="5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.423616 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vfzs"] Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.431011 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5vfzs"] Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.432477 4610 scope.go:117] "RemoveContainer" containerID="168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.463221 4610 scope.go:117] "RemoveContainer" containerID="bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801" Oct 06 09:15:18 crc kubenswrapper[4610]: E1006 09:15:18.468038 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801\": container with ID starting with bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801 not found: ID does not exist" containerID="bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.468102 4610 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801"} err="failed to get container status \"bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801\": rpc error: code = NotFound desc = could not find container \"bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801\": container with ID starting with bdaa2c60cac9b650e707167ec4fe9270075cbc923f604edd23e571313c6ee801 not found: ID does not exist" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.468129 4610 scope.go:117] "RemoveContainer" containerID="5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841" Oct 06 09:15:18 crc kubenswrapper[4610]: E1006 09:15:18.468647 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841\": container with ID starting with 5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841 not found: ID does not exist" containerID="5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.468696 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841"} err="failed to get container status \"5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841\": rpc error: code = NotFound desc = could not find container \"5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841\": container with ID starting with 5b6e732fd092276d0e2571d27a1bf2f976c806315b72d82e196f30766ecf5841 not found: ID does not exist" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.468728 4610 scope.go:117] "RemoveContainer" containerID="168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb" Oct 06 09:15:18 crc kubenswrapper[4610]: E1006 09:15:18.468980 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb\": container with ID starting with 168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb not found: ID does not exist" containerID="168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb" Oct 06 09:15:18 crc kubenswrapper[4610]: I1006 09:15:18.469015 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb"} err="failed to get container status \"168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb\": rpc error: code = NotFound desc = could not find container \"168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb\": container with ID starting with 168b12e81cb6bed0df523d5f4ad066b5464633cd84f1361e49de1ccf24530cfb not found: ID does not exist" Oct 06 09:15:19 crc kubenswrapper[4610]: I1006 09:15:19.090138 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" path="/var/lib/kubelet/pods/7b1a8891-4621-4216-b50b-63af59fd908a/volumes" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.582380 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qgbmf"] Oct 06 09:15:36 crc kubenswrapper[4610]: E1006 09:15:36.584393 4610 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="extract-utilities" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.584427 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="extract-utilities" Oct 06 09:15:36 crc kubenswrapper[4610]: E1006 09:15:36.584447 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="registry-server" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.584458 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="registry-server" Oct 06 09:15:36 crc kubenswrapper[4610]: E1006 09:15:36.584492 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b83115b-6506-4f12-8b07-a850a423dc9b" containerName="collect-profiles" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.584501 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b83115b-6506-4f12-8b07-a850a423dc9b" containerName="collect-profiles" Oct 06 09:15:36 crc kubenswrapper[4610]: E1006 09:15:36.584522 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="extract-content" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.584531 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="extract-content" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.584774 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1a8891-4621-4216-b50b-63af59fd908a" containerName="registry-server" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.584799 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b83115b-6506-4f12-8b07-a850a423dc9b" containerName="collect-profiles" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.586596 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.604888 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qgbmf"] Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.772013 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-catalog-content\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.772403 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4s2z\" (UniqueName: \"kubernetes.io/projected/3b4cf406-a845-4ddf-a862-4671287522aa-kube-api-access-r4s2z\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.772631 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-utilities\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.874931 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-utilities\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.875013 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-catalog-content\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.875111 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4s2z\" (UniqueName: \"kubernetes.io/projected/3b4cf406-a845-4ddf-a862-4671287522aa-kube-api-access-r4s2z\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.875920 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-utilities\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.875950 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-catalog-content\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.894771 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r4s2z\" (UniqueName: \"kubernetes.io/projected/3b4cf406-a845-4ddf-a862-4671287522aa-kube-api-access-r4s2z\") pod \"certified-operators-qgbmf\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:36 crc kubenswrapper[4610]: I1006 09:15:36.947121 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:37 crc kubenswrapper[4610]: I1006 09:15:37.560733 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qgbmf"] Oct 06 09:15:38 crc kubenswrapper[4610]: I1006 09:15:38.561172 4610 generic.go:334] "Generic (PLEG): container finished" podID="3b4cf406-a845-4ddf-a862-4671287522aa" containerID="d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6" exitCode=0 Oct 06 09:15:38 crc kubenswrapper[4610]: I1006 09:15:38.561269 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerDied","Data":"d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6"} Oct 06 09:15:38 crc kubenswrapper[4610]: I1006 09:15:38.561546 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerStarted","Data":"f7d360a3de08316638a8d96c67bb5d4b719dd3ac4c401b47213026c60eb8a274"} Oct 06 09:15:40 crc kubenswrapper[4610]: I1006 09:15:40.582200 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerStarted","Data":"1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676"} Oct 06 09:15:40 crc kubenswrapper[4610]: I1006 09:15:40.586973 4610 generic.go:334] "Generic (PLEG): container finished" podID="b87a175d-5d06-4825-981c-ed2cf97fb652" containerID="8565817acccfbb8869b72d6868df94767462ff692a63af0ed9ebf0b1a9b7e8b9" exitCode=0 Oct 06 09:15:40 crc kubenswrapper[4610]: I1006 09:15:40.587015 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" event={"ID":"b87a175d-5d06-4825-981c-ed2cf97fb652","Type":"ContainerDied","Data":"8565817acccfbb8869b72d6868df94767462ff692a63af0ed9ebf0b1a9b7e8b9"} Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.016786 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.177267 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-ssh-key\") pod \"b87a175d-5d06-4825-981c-ed2cf97fb652\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.177669 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-inventory\") pod \"b87a175d-5d06-4825-981c-ed2cf97fb652\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.177807 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7g7d\" (UniqueName: \"kubernetes.io/projected/b87a175d-5d06-4825-981c-ed2cf97fb652-kube-api-access-t7g7d\") pod \"b87a175d-5d06-4825-981c-ed2cf97fb652\" (UID: \"b87a175d-5d06-4825-981c-ed2cf97fb652\") " Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.187308 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87a175d-5d06-4825-981c-ed2cf97fb652-kube-api-access-t7g7d" (OuterVolumeSpecName: "kube-api-access-t7g7d") pod "b87a175d-5d06-4825-981c-ed2cf97fb652" (UID: "b87a175d-5d06-4825-981c-ed2cf97fb652"). InnerVolumeSpecName "kube-api-access-t7g7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.203613 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b87a175d-5d06-4825-981c-ed2cf97fb652" (UID: "b87a175d-5d06-4825-981c-ed2cf97fb652"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.211062 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-inventory" (OuterVolumeSpecName: "inventory") pod "b87a175d-5d06-4825-981c-ed2cf97fb652" (UID: "b87a175d-5d06-4825-981c-ed2cf97fb652"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.281733 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.281773 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a175d-5d06-4825-981c-ed2cf97fb652-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.281786 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7g7d\" (UniqueName: \"kubernetes.io/projected/b87a175d-5d06-4825-981c-ed2cf97fb652-kube-api-access-t7g7d\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.607236 4610 generic.go:334] "Generic (PLEG): container finished" podID="3b4cf406-a845-4ddf-a862-4671287522aa" containerID="1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676" exitCode=0 Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.607327 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerDied","Data":"1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676"} Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.610004 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" event={"ID":"b87a175d-5d06-4825-981c-ed2cf97fb652","Type":"ContainerDied","Data":"82d550e035a9f8dc0326c65b2529a5b87cc0d1c4323dd71f21354768c64bc198"} Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.610177 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82d550e035a9f8dc0326c65b2529a5b87cc0d1c4323dd71f21354768c64bc198" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.610100 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-687wd" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.728167 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4l44"] Oct 06 09:15:42 crc kubenswrapper[4610]: E1006 09:15:42.728717 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87a175d-5d06-4825-981c-ed2cf97fb652" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.728742 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87a175d-5d06-4825-981c-ed2cf97fb652" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.729077 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87a175d-5d06-4825-981c-ed2cf97fb652" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.730162 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.736977 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.737024 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.737308 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.738530 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.755647 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4l44"] Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.791014 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.791295 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8gn\" (UniqueName: \"kubernetes.io/projected/ee7bec6c-22cc-448e-8939-798d80db2045-kube-api-access-vq8gn\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.791365 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.892876 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.892981 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8gn\" (UniqueName: \"kubernetes.io/projected/ee7bec6c-22cc-448e-8939-798d80db2045-kube-api-access-vq8gn\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.893008 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc 
kubenswrapper[4610]: I1006 09:15:42.897852 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.909775 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:42 crc kubenswrapper[4610]: I1006 09:15:42.912333 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8gn\" (UniqueName: \"kubernetes.io/projected/ee7bec6c-22cc-448e-8939-798d80db2045-kube-api-access-vq8gn\") pod \"ssh-known-hosts-edpm-deployment-p4l44\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:43 crc kubenswrapper[4610]: I1006 09:15:43.047908 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:43 crc kubenswrapper[4610]: I1006 09:15:43.597351 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4l44"] Oct 06 09:15:43 crc kubenswrapper[4610]: I1006 09:15:43.630332 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerStarted","Data":"ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148"} Oct 06 09:15:43 crc kubenswrapper[4610]: I1006 09:15:43.631943 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" event={"ID":"ee7bec6c-22cc-448e-8939-798d80db2045","Type":"ContainerStarted","Data":"c738007f94bb14826951e9a55deda5dcdb6e2a28d28724abada448f454417dd9"} Oct 06 09:15:43 crc kubenswrapper[4610]: I1006 09:15:43.665805 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qgbmf" podStartSLOduration=3.060400831 podStartE2EDuration="7.665785136s" podCreationTimestamp="2025-10-06 09:15:36 +0000 UTC" firstStartedPulling="2025-10-06 09:15:38.563002663 +0000 UTC m=+2070.278056051" lastFinishedPulling="2025-10-06 09:15:43.168386968 +0000 UTC m=+2074.883440356" observedRunningTime="2025-10-06 09:15:43.647101129 +0000 UTC m=+2075.362154527" watchObservedRunningTime="2025-10-06 09:15:43.665785136 +0000 UTC m=+2075.380838524" Oct 06 09:15:44 crc kubenswrapper[4610]: I1006 09:15:44.641791 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" event={"ID":"ee7bec6c-22cc-448e-8939-798d80db2045","Type":"ContainerStarted","Data":"a68dd36624083856f9729fbca59bf9b39b038d5adde7369778aaed106d3a5792"} Oct 06 09:15:44 crc kubenswrapper[4610]: I1006 09:15:44.661964 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" podStartSLOduration=2.493371934 podStartE2EDuration="2.66194616s" podCreationTimestamp="2025-10-06 09:15:42 +0000 UTC" firstStartedPulling="2025-10-06 09:15:43.615752135 +0000 UTC 
m=+2075.330805523" lastFinishedPulling="2025-10-06 09:15:43.784326361 +0000 UTC m=+2075.499379749" observedRunningTime="2025-10-06 09:15:44.659435833 +0000 UTC m=+2076.374489221" watchObservedRunningTime="2025-10-06 09:15:44.66194616 +0000 UTC m=+2076.376999548" Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.469736 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.469825 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.469900 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.471139 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a7252b6aaf2929a0234f9cea134ab6667be1abfa6b1537e1c78a905ebc421a87"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.471239 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://a7252b6aaf2929a0234f9cea134ab6667be1abfa6b1537e1c78a905ebc421a87" gracePeriod=600 Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.665961 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="a7252b6aaf2929a0234f9cea134ab6667be1abfa6b1537e1c78a905ebc421a87" exitCode=0 Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.666013 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"a7252b6aaf2929a0234f9cea134ab6667be1abfa6b1537e1c78a905ebc421a87"} Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.666068 4610 scope.go:117] "RemoveContainer" containerID="4355728cde6282e17e826f61331953f538f3a328d8d6ebab47258aceef549a95" Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.947558 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:46 crc kubenswrapper[4610]: I1006 09:15:46.947719 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:47 crc kubenswrapper[4610]: I1006 09:15:47.020369 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:47 crc kubenswrapper[4610]: I1006 09:15:47.677025 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"} Oct 06 09:15:48 crc kubenswrapper[4610]: I1006 09:15:48.731024 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:48 crc kubenswrapper[4610]: I1006 09:15:48.786448 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qgbmf"] Oct 06 09:15:50 crc kubenswrapper[4610]: I1006 09:15:50.706916 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qgbmf" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="registry-server" containerID="cri-o://ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148" gracePeriod=2 Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.178457 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.264283 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4s2z\" (UniqueName: \"kubernetes.io/projected/3b4cf406-a845-4ddf-a862-4671287522aa-kube-api-access-r4s2z\") pod \"3b4cf406-a845-4ddf-a862-4671287522aa\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.264350 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-utilities\") pod \"3b4cf406-a845-4ddf-a862-4671287522aa\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.264445 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-catalog-content\") pod \"3b4cf406-a845-4ddf-a862-4671287522aa\" (UID: \"3b4cf406-a845-4ddf-a862-4671287522aa\") " Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.267820 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-utilities" (OuterVolumeSpecName: "utilities") pod "3b4cf406-a845-4ddf-a862-4671287522aa" (UID: "3b4cf406-a845-4ddf-a862-4671287522aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.274140 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4cf406-a845-4ddf-a862-4671287522aa-kube-api-access-r4s2z" (OuterVolumeSpecName: "kube-api-access-r4s2z") pod "3b4cf406-a845-4ddf-a862-4671287522aa" (UID: "3b4cf406-a845-4ddf-a862-4671287522aa"). InnerVolumeSpecName "kube-api-access-r4s2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.370661 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4s2z\" (UniqueName: \"kubernetes.io/projected/3b4cf406-a845-4ddf-a862-4671287522aa-kube-api-access-r4s2z\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.370898 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.564851 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b4cf406-a845-4ddf-a862-4671287522aa" (UID: "3b4cf406-a845-4ddf-a862-4671287522aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.578984 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4cf406-a845-4ddf-a862-4671287522aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.721616 4610 generic.go:334] "Generic (PLEG): container finished" podID="3b4cf406-a845-4ddf-a862-4671287522aa" containerID="ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148" exitCode=0 Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.721727 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerDied","Data":"ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148"} Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.721749 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgbmf" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.722952 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgbmf" event={"ID":"3b4cf406-a845-4ddf-a862-4671287522aa","Type":"ContainerDied","Data":"f7d360a3de08316638a8d96c67bb5d4b719dd3ac4c401b47213026c60eb8a274"} Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.722998 4610 scope.go:117] "RemoveContainer" containerID="ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.726377 4610 generic.go:334] "Generic (PLEG): container finished" podID="ee7bec6c-22cc-448e-8939-798d80db2045" containerID="a68dd36624083856f9729fbca59bf9b39b038d5adde7369778aaed106d3a5792" exitCode=0 Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.726429 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" event={"ID":"ee7bec6c-22cc-448e-8939-798d80db2045","Type":"ContainerDied","Data":"a68dd36624083856f9729fbca59bf9b39b038d5adde7369778aaed106d3a5792"} Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.759590 4610 scope.go:117] "RemoveContainer" containerID="1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.792030 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qgbmf"] Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.796422 4610 scope.go:117] "RemoveContainer" containerID="d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.799721 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qgbmf"] Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.828582 4610 scope.go:117] "RemoveContainer" containerID="ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148" Oct 06 09:15:51 crc kubenswrapper[4610]: E1006 09:15:51.829139 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148\": container with ID starting with ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148 not found: ID does not exist" containerID="ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.829182 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148"} err="failed to get container status \"ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148\": rpc error: code = NotFound desc = could not find container \"ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148\": container with ID starting with ff18b55255d38edb3fe95d2a05ab21f07f95f0d51c3cbfea8964cf273c386148 not found: ID does not exist" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.829210 4610 scope.go:117] "RemoveContainer" containerID="1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676" Oct 06 09:15:51 crc kubenswrapper[4610]: E1006 09:15:51.829823 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676\": container with ID 
starting with 1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676 not found: ID does not exist" containerID="1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.829855 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676"} err="failed to get container status \"1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676\": rpc error: code = NotFound desc = could not find container \"1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676\": container with ID starting with 1a80095097b5dccf2bacd644841eb9cde571716072ed79e51c980b5ed0111676 not found: ID does not exist" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.829880 4610 scope.go:117] "RemoveContainer" containerID="d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6" Oct 06 09:15:51 crc kubenswrapper[4610]: E1006 09:15:51.830233 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6\": container with ID starting with d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6 not found: ID does not exist" containerID="d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6" Oct 06 09:15:51 crc kubenswrapper[4610]: I1006 09:15:51.830326 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6"} err="failed to get container status \"d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6\": rpc error: code = NotFound desc = could not find container \"d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6\": container with ID starting with d48827cb46ebb9e47365843b206a5ff0f80e6474b197aaf41333d123e4d04de6 not found: ID does not exist" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.112930 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" path="/var/lib/kubelet/pods/3b4cf406-a845-4ddf-a862-4671287522aa/volumes" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.294146 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.410456 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-ssh-key-openstack-edpm-ipam\") pod \"ee7bec6c-22cc-448e-8939-798d80db2045\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.410922 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq8gn\" (UniqueName: \"kubernetes.io/projected/ee7bec6c-22cc-448e-8939-798d80db2045-kube-api-access-vq8gn\") pod \"ee7bec6c-22cc-448e-8939-798d80db2045\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.410992 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-inventory-0\") pod \"ee7bec6c-22cc-448e-8939-798d80db2045\" (UID: \"ee7bec6c-22cc-448e-8939-798d80db2045\") " Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.437259 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7bec6c-22cc-448e-8939-798d80db2045-kube-api-access-vq8gn" (OuterVolumeSpecName: "kube-api-access-vq8gn") pod "ee7bec6c-22cc-448e-8939-798d80db2045" (UID: "ee7bec6c-22cc-448e-8939-798d80db2045"). InnerVolumeSpecName "kube-api-access-vq8gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.457184 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ee7bec6c-22cc-448e-8939-798d80db2045" (UID: "ee7bec6c-22cc-448e-8939-798d80db2045"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.472515 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee7bec6c-22cc-448e-8939-798d80db2045" (UID: "ee7bec6c-22cc-448e-8939-798d80db2045"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.513667 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.513703 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq8gn\" (UniqueName: \"kubernetes.io/projected/ee7bec6c-22cc-448e-8939-798d80db2045-kube-api-access-vq8gn\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.513714 4610 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee7bec6c-22cc-448e-8939-798d80db2045-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.748146 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" event={"ID":"ee7bec6c-22cc-448e-8939-798d80db2045","Type":"ContainerDied","Data":"c738007f94bb14826951e9a55deda5dcdb6e2a28d28724abada448f454417dd9"} Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.748424 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c738007f94bb14826951e9a55deda5dcdb6e2a28d28724abada448f454417dd9" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.748486 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4l44" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.843508 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht"] Oct 06 09:15:53 crc kubenswrapper[4610]: E1006 09:15:53.843937 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="extract-content" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.843958 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="extract-content" Oct 06 09:15:53 crc kubenswrapper[4610]: E1006 09:15:53.843985 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="extract-utilities" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.843993 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="extract-utilities" Oct 06 09:15:53 crc kubenswrapper[4610]: E1006 09:15:53.844031 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7bec6c-22cc-448e-8939-798d80db2045" containerName="ssh-known-hosts-edpm-deployment" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.844066 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7bec6c-22cc-448e-8939-798d80db2045" containerName="ssh-known-hosts-edpm-deployment" Oct 06 09:15:53 crc kubenswrapper[4610]: E1006 09:15:53.844086 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="registry-server" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.844096 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="registry-server" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.844382 4610 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ee7bec6c-22cc-448e-8939-798d80db2045" containerName="ssh-known-hosts-edpm-deployment" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.844412 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4cf406-a845-4ddf-a862-4671287522aa" containerName="registry-server" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.845383 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.847117 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.850707 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.859247 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.859533 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:15:53 crc kubenswrapper[4610]: I1006 09:15:53.860762 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht"] Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.023303 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.023429 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.023477 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxtx6\" (UniqueName: \"kubernetes.io/projected/5c30c0cb-9027-4935-bca1-0debc398c091-kube-api-access-vxtx6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.124963 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxtx6\" (UniqueName: \"kubernetes.io/projected/5c30c0cb-9027-4935-bca1-0debc398c091-kube-api-access-vxtx6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.125113 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.125234 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.130741 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.130818 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.149389 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxtx6\" (UniqueName: \"kubernetes.io/projected/5c30c0cb-9027-4935-bca1-0debc398c091-kube-api-access-vxtx6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fgvht\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.170417 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.730254 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht"] Oct 06 09:15:54 crc kubenswrapper[4610]: I1006 09:15:54.765883 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" event={"ID":"5c30c0cb-9027-4935-bca1-0debc398c091","Type":"ContainerStarted","Data":"d8c7f349ffdbabfb1629cf879636fa4dcb76e0314c0d8265e98a136bf42148b9"} Oct 06 09:15:55 crc kubenswrapper[4610]: I1006 09:15:55.778581 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" event={"ID":"5c30c0cb-9027-4935-bca1-0debc398c091","Type":"ContainerStarted","Data":"7198b01a08edbfb612c5eecbbe599186032395064844fb041f0bdffd668ef847"} Oct 06 09:15:55 crc kubenswrapper[4610]: I1006 09:15:55.809869 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" podStartSLOduration=2.607958951 podStartE2EDuration="2.809845205s" podCreationTimestamp="2025-10-06 09:15:53 +0000 UTC" firstStartedPulling="2025-10-06 09:15:54.7369773 +0000 UTC m=+2086.452030688" lastFinishedPulling="2025-10-06 09:15:54.938863554 +0000 UTC m=+2086.653916942" observedRunningTime="2025-10-06 09:15:55.807160934 +0000 UTC m=+2087.522214322" watchObservedRunningTime="2025-10-06 09:15:55.809845205 +0000 UTC m=+2087.524898633" Oct 06 09:16:04 crc kubenswrapper[4610]: I1006 09:16:04.864621 4610 generic.go:334] "Generic (PLEG): container finished" podID="5c30c0cb-9027-4935-bca1-0debc398c091" containerID="7198b01a08edbfb612c5eecbbe599186032395064844fb041f0bdffd668ef847" exitCode=0 Oct 06 09:16:04 crc kubenswrapper[4610]: I1006 09:16:04.864670 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" event={"ID":"5c30c0cb-9027-4935-bca1-0debc398c091","Type":"ContainerDied","Data":"7198b01a08edbfb612c5eecbbe599186032395064844fb041f0bdffd668ef847"}
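The pod_startup_latency_tracker line above carries enough data to check its own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small sketch verifying this with the run-os pod's timestamps; the subtraction rule is inferred from the numbers, not taken from kubelet source, and the monotonic "m=+…" suffixes are dropped:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-06 09:15:53 +0000 UTC")
	firstPull := parse("2025-10-06 09:15:54.7369773 +0000 UTC")
	lastPull := parse("2025-10-06 09:15:54.938863554 +0000 UTC")
	running := parse("2025-10-06 09:15:55.809845205 +0000 UTC")

	e2e := running.Sub(created)     // 2.809845205s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 201.886254ms spent pulling the image
	fmt.Println(e2e, e2e-pull)     // 2.809845205s 2.607958951s = podStartSLOduration
}
```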
Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.361992 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.386013 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-ssh-key\") pod \"5c30c0cb-9027-4935-bca1-0debc398c091\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.386300 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-inventory\") pod \"5c30c0cb-9027-4935-bca1-0debc398c091\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.386462 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxtx6\" (UniqueName: \"kubernetes.io/projected/5c30c0cb-9027-4935-bca1-0debc398c091-kube-api-access-vxtx6\") pod \"5c30c0cb-9027-4935-bca1-0debc398c091\" (UID: \"5c30c0cb-9027-4935-bca1-0debc398c091\") " Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.399315 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c30c0cb-9027-4935-bca1-0debc398c091-kube-api-access-vxtx6" (OuterVolumeSpecName: "kube-api-access-vxtx6") pod "5c30c0cb-9027-4935-bca1-0debc398c091" (UID: "5c30c0cb-9027-4935-bca1-0debc398c091"). InnerVolumeSpecName "kube-api-access-vxtx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.446674 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c30c0cb-9027-4935-bca1-0debc398c091" (UID: "5c30c0cb-9027-4935-bca1-0debc398c091"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.460011 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-inventory" (OuterVolumeSpecName: "inventory") pod "5c30c0cb-9027-4935-bca1-0debc398c091" (UID: "5c30c0cb-9027-4935-bca1-0debc398c091"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.489490 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.489524 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxtx6\" (UniqueName: \"kubernetes.io/projected/5c30c0cb-9027-4935-bca1-0debc398c091-kube-api-access-vxtx6\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.489538 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c30c0cb-9027-4935-bca1-0debc398c091-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.901758 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" event={"ID":"5c30c0cb-9027-4935-bca1-0debc398c091","Type":"ContainerDied","Data":"d8c7f349ffdbabfb1629cf879636fa4dcb76e0314c0d8265e98a136bf42148b9"} Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.901810 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8c7f349ffdbabfb1629cf879636fa4dcb76e0314c0d8265e98a136bf42148b9" Oct 06 09:16:06 crc kubenswrapper[4610]: I1006 09:16:06.902089 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fgvht" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.042124 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw"] Oct 06 09:16:07 crc kubenswrapper[4610]: E1006 09:16:07.042490 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c30c0cb-9027-4935-bca1-0debc398c091" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.042507 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c30c0cb-9027-4935-bca1-0debc398c091" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.042693 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c30c0cb-9027-4935-bca1-0debc398c091" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.043259 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.045940 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.046735 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.046835 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.047909 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.055484 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw"] Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.104498 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.104588 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.104728 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72r94\" (UniqueName: \"kubernetes.io/projected/f1784103-7612-4a23-9135-eb81df0fe2ce-kube-api-access-72r94\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.206235 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72r94\" (UniqueName: \"kubernetes.io/projected/f1784103-7612-4a23-9135-eb81df0fe2ce-kube-api-access-72r94\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.206310 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.206353 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: 
\"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.209766 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.210187 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.226741 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72r94\" (UniqueName: \"kubernetes.io/projected/f1784103-7612-4a23-9135-eb81df0fe2ce-kube-api-access-72r94\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.360884 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:07 crc kubenswrapper[4610]: I1006 09:16:07.932993 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw"] Oct 06 09:16:08 crc kubenswrapper[4610]: I1006 09:16:08.918272 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" event={"ID":"f1784103-7612-4a23-9135-eb81df0fe2ce","Type":"ContainerStarted","Data":"92a4cb042228436c663f3217dd75ef34c3357cf860b1403eb18e2f4e17b434c5"} Oct 06 09:16:08 crc kubenswrapper[4610]: I1006 09:16:08.918563 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" event={"ID":"f1784103-7612-4a23-9135-eb81df0fe2ce","Type":"ContainerStarted","Data":"21ffa44af415b803e996eb515cce50808e35c443a6229e96ce9c782f906d2ffa"} Oct 06 09:16:08 crc kubenswrapper[4610]: I1006 09:16:08.940803 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" podStartSLOduration=1.530542262 podStartE2EDuration="1.940783961s" podCreationTimestamp="2025-10-06 09:16:07 +0000 UTC" firstStartedPulling="2025-10-06 09:16:07.961467826 +0000 UTC m=+2099.676521214" lastFinishedPulling="2025-10-06 09:16:08.371709505 +0000 UTC m=+2100.086762913" observedRunningTime="2025-10-06 09:16:08.934973247 +0000 UTC m=+2100.650026645" watchObservedRunningTime="2025-10-06 09:16:08.940783961 +0000 UTC m=+2100.655837349" Oct 06 09:16:19 crc kubenswrapper[4610]: I1006 09:16:19.011318 4610 generic.go:334] "Generic (PLEG): container finished" podID="f1784103-7612-4a23-9135-eb81df0fe2ce" containerID="92a4cb042228436c663f3217dd75ef34c3357cf860b1403eb18e2f4e17b434c5" exitCode=0 Oct 06 09:16:19 crc kubenswrapper[4610]: I1006 09:16:19.011495 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" 
event={"ID":"f1784103-7612-4a23-9135-eb81df0fe2ce","Type":"ContainerDied","Data":"92a4cb042228436c663f3217dd75ef34c3357cf860b1403eb18e2f4e17b434c5"} Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.523325 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.590305 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-inventory\") pod \"f1784103-7612-4a23-9135-eb81df0fe2ce\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.590394 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72r94\" (UniqueName: \"kubernetes.io/projected/f1784103-7612-4a23-9135-eb81df0fe2ce-kube-api-access-72r94\") pod \"f1784103-7612-4a23-9135-eb81df0fe2ce\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.590551 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-ssh-key\") pod \"f1784103-7612-4a23-9135-eb81df0fe2ce\" (UID: \"f1784103-7612-4a23-9135-eb81df0fe2ce\") " Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.615186 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1784103-7612-4a23-9135-eb81df0fe2ce-kube-api-access-72r94" (OuterVolumeSpecName: "kube-api-access-72r94") pod "f1784103-7612-4a23-9135-eb81df0fe2ce" (UID: "f1784103-7612-4a23-9135-eb81df0fe2ce"). InnerVolumeSpecName "kube-api-access-72r94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.621542 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f1784103-7612-4a23-9135-eb81df0fe2ce" (UID: "f1784103-7612-4a23-9135-eb81df0fe2ce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.629216 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-inventory" (OuterVolumeSpecName: "inventory") pod "f1784103-7612-4a23-9135-eb81df0fe2ce" (UID: "f1784103-7612-4a23-9135-eb81df0fe2ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.693114 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.693167 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1784103-7612-4a23-9135-eb81df0fe2ce-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:20 crc kubenswrapper[4610]: I1006 09:16:20.693178 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72r94\" (UniqueName: \"kubernetes.io/projected/f1784103-7612-4a23-9135-eb81df0fe2ce-kube-api-access-72r94\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.033449 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" event={"ID":"f1784103-7612-4a23-9135-eb81df0fe2ce","Type":"ContainerDied","Data":"21ffa44af415b803e996eb515cce50808e35c443a6229e96ce9c782f906d2ffa"} Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.033511 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ffa44af415b803e996eb515cce50808e35c443a6229e96ce9c782f906d2ffa" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.033904 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.179729 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p"] Oct 06 09:16:21 crc kubenswrapper[4610]: E1006 09:16:21.180232 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1784103-7612-4a23-9135-eb81df0fe2ce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.180252 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1784103-7612-4a23-9135-eb81df0fe2ce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.180529 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1784103-7612-4a23-9135-eb81df0fe2ce" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.181349 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.184032 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.184275 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.184433 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.185033 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.185146 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.185460 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.185700 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.196361 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.201478 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p"] Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.304611 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.304656 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.304763 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.304808 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.304911 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.304955 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305176 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305218 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4nw\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-kube-api-access-zs4nw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305472 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305548 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305591 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305657 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305712 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.305758 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407308 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407395 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407430 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4nw\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-kube-api-access-zs4nw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407466 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407490 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407509 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407534 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407557 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407580 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407606 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407621 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407654 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: 
\"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407673 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.407715 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.410906 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.411887 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.415619 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.418140 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.418419 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.419231 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.419809 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.419892 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.420317 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.420965 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.423782 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.423811 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.424299 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" 
Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.427972 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4nw\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-kube-api-access-zs4nw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ps99p\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:21 crc kubenswrapper[4610]: I1006 09:16:21.501450 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:16:22 crc kubenswrapper[4610]: I1006 09:16:22.103432 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p"] Oct 06 09:16:23 crc kubenswrapper[4610]: I1006 09:16:23.056525 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" event={"ID":"7e49b85b-bbed-4c13-b513-3d61369aa3c0","Type":"ContainerStarted","Data":"da8ad801cf50d7497be5922d6aa37b15f3e22a2132a423fb91c41ee3c65e159c"} Oct 06 09:16:23 crc kubenswrapper[4610]: I1006 09:16:23.056882 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" event={"ID":"7e49b85b-bbed-4c13-b513-3d61369aa3c0","Type":"ContainerStarted","Data":"efd42cfa2a1c932a1a9fbc291e3633a2abf44b0cb349542857c4bb091b14a4e2"} Oct 06 09:16:23 crc kubenswrapper[4610]: I1006 09:16:23.081019 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" podStartSLOduration=1.9322613199999998 podStartE2EDuration="2.080997699s" podCreationTimestamp="2025-10-06 09:16:21 +0000 UTC" firstStartedPulling="2025-10-06 09:16:22.098128719 +0000 UTC m=+2113.813182107" lastFinishedPulling="2025-10-06 09:16:22.246865108 +0000 UTC m=+2113.961918486" observedRunningTime="2025-10-06 09:16:23.079640963 +0000 UTC m=+2114.794694361" watchObservedRunningTime="2025-10-06 09:16:23.080997699 +0000 UTC m=+2114.796051097" Oct 06 09:16:28 crc kubenswrapper[4610]: I1006 09:16:28.954739 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hr7gq"] Oct 06 09:16:28 crc kubenswrapper[4610]: I1006 09:16:28.972025 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.015495 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hr7gq"] Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.082260 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdvr\" (UniqueName: \"kubernetes.io/projected/a59ada84-9d6d-4b25-9723-8cd0c60059fe-kube-api-access-bgdvr\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.082404 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-catalog-content\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.082426 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-utilities\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.184240 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-catalog-content\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.184288 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-utilities\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.184373 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdvr\" (UniqueName: \"kubernetes.io/projected/a59ada84-9d6d-4b25-9723-8cd0c60059fe-kube-api-access-bgdvr\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.185830 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-catalog-content\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.186060 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-utilities\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.209127 4610 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bgdvr\" (UniqueName: \"kubernetes.io/projected/a59ada84-9d6d-4b25-9723-8cd0c60059fe-kube-api-access-bgdvr\") pod \"redhat-marketplace-hr7gq\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.332413 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:29 crc kubenswrapper[4610]: I1006 09:16:29.850267 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hr7gq"] Oct 06 09:16:30 crc kubenswrapper[4610]: I1006 09:16:30.114102 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerStarted","Data":"c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b"} Oct 06 09:16:30 crc kubenswrapper[4610]: I1006 09:16:30.114142 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerStarted","Data":"9864eb927cc42df8837ddbf386aced6cab47452bab9f8b8f316d8f009059931e"} Oct 06 09:16:31 crc kubenswrapper[4610]: I1006 09:16:31.133247 4610 generic.go:334] "Generic (PLEG): container finished" podID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerID="c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b" exitCode=0 Oct 06 09:16:31 crc kubenswrapper[4610]: I1006 09:16:31.133310 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerDied","Data":"c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b"} Oct 06 09:16:32 crc kubenswrapper[4610]: I1006 09:16:32.145825 4610 generic.go:334] "Generic (PLEG): container finished" podID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerID="3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b" exitCode=0 Oct 06 09:16:32 crc kubenswrapper[4610]: I1006 09:16:32.145875 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerDied","Data":"3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b"} Oct 06 09:16:33 crc kubenswrapper[4610]: I1006 09:16:33.158779 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerStarted","Data":"8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1"} Oct 06 09:16:33 crc kubenswrapper[4610]: I1006 09:16:33.186695 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hr7gq" podStartSLOduration=2.718413053 podStartE2EDuration="5.186675387s" podCreationTimestamp="2025-10-06 09:16:28 +0000 UTC" firstStartedPulling="2025-10-06 09:16:30.116821841 +0000 UTC m=+2121.831875229" lastFinishedPulling="2025-10-06 09:16:32.585084175 +0000 UTC m=+2124.300137563" observedRunningTime="2025-10-06 09:16:33.17738767 +0000 UTC m=+2124.892441098" watchObservedRunningTime="2025-10-06 09:16:33.186675387 +0000 UTC m=+2124.901728765" Oct 06 09:16:39 crc kubenswrapper[4610]: I1006 09:16:39.333133 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:39 crc kubenswrapper[4610]: I1006 09:16:39.333854 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:39 crc kubenswrapper[4610]: I1006 09:16:39.386459 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:40 crc kubenswrapper[4610]: I1006 09:16:40.272023 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:40 crc kubenswrapper[4610]: I1006 09:16:40.324513 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hr7gq"] Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.247390 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hr7gq" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="registry-server" containerID="cri-o://8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1" gracePeriod=2 Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.696863 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.858430 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-catalog-content\") pod \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.858496 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-utilities\") pod \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.858566 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgdvr\" (UniqueName: \"kubernetes.io/projected/a59ada84-9d6d-4b25-9723-8cd0c60059fe-kube-api-access-bgdvr\") pod \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\" (UID: \"a59ada84-9d6d-4b25-9723-8cd0c60059fe\") " Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.859417 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-utilities" (OuterVolumeSpecName: "utilities") pod "a59ada84-9d6d-4b25-9723-8cd0c60059fe" (UID: "a59ada84-9d6d-4b25-9723-8cd0c60059fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.864699 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59ada84-9d6d-4b25-9723-8cd0c60059fe-kube-api-access-bgdvr" (OuterVolumeSpecName: "kube-api-access-bgdvr") pod "a59ada84-9d6d-4b25-9723-8cd0c60059fe" (UID: "a59ada84-9d6d-4b25-9723-8cd0c60059fe"). InnerVolumeSpecName "kube-api-access-bgdvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.877239 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a59ada84-9d6d-4b25-9723-8cd0c60059fe" (UID: "a59ada84-9d6d-4b25-9723-8cd0c60059fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.960474 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.960507 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a59ada84-9d6d-4b25-9723-8cd0c60059fe-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:42 crc kubenswrapper[4610]: I1006 09:16:42.960516 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgdvr\" (UniqueName: \"kubernetes.io/projected/a59ada84-9d6d-4b25-9723-8cd0c60059fe-kube-api-access-bgdvr\") on node \"crc\" DevicePath \"\"" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.256486 4610 generic.go:334] "Generic (PLEG): container finished" podID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerID="8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1" exitCode=0 Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.256523 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerDied","Data":"8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1"} Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.256550 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hr7gq" event={"ID":"a59ada84-9d6d-4b25-9723-8cd0c60059fe","Type":"ContainerDied","Data":"9864eb927cc42df8837ddbf386aced6cab47452bab9f8b8f316d8f009059931e"} Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.256566 4610 scope.go:117] "RemoveContainer" containerID="8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.256685 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hr7gq" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.282466 4610 scope.go:117] "RemoveContainer" containerID="3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.285139 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hr7gq"] Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.290822 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hr7gq"] Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.311494 4610 scope.go:117] "RemoveContainer" containerID="c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.342394 4610 scope.go:117] "RemoveContainer" containerID="8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1" Oct 06 09:16:43 crc kubenswrapper[4610]: E1006 09:16:43.342842 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1\": container with ID starting with 8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1 not found: ID does not exist" containerID="8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.342882 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1"} err="failed to get container status \"8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1\": rpc error: code = NotFound desc = could not find container \"8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1\": container with ID starting with 8ea006bb182c293149a912bd86c3d85977c510b1b1aa3d2ba8518c1bb07bf8d1 not found: ID does not exist" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.342917 4610 scope.go:117] "RemoveContainer" containerID="3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b" Oct 06 09:16:43 crc kubenswrapper[4610]: E1006 09:16:43.343399 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b\": container with ID starting with 3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b not found: ID does not exist" containerID="3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.343425 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b"} err="failed to get container status \"3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b\": rpc error: code = NotFound desc = could not find container \"3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b\": container with ID starting with 3b7c596fad3b6fb8648db636899924dddc909b8a46ccf462d9106e91e161561b not found: ID does not exist" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.343440 4610 scope.go:117] "RemoveContainer" containerID="c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b" Oct 06 09:16:43 crc kubenswrapper[4610]: E1006 09:16:43.343832 4610 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b\": container with ID starting with c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b not found: ID does not exist" containerID="c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b" Oct 06 09:16:43 crc kubenswrapper[4610]: I1006 09:16:43.343868 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b"} err="failed to get container status \"c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b\": rpc error: code = NotFound desc = could not find container \"c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b\": container with ID starting with c794325544d2d53b0a384bc4d373c5aa56fffcc37c5f03d1ab6fc43860bb497b not found: ID does not exist" Oct 06 09:16:45 crc kubenswrapper[4610]: I1006 09:16:45.083120 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" path="/var/lib/kubelet/pods/a59ada84-9d6d-4b25-9723-8cd0c60059fe/volumes" Oct 06 09:17:07 crc kubenswrapper[4610]: I1006 09:17:07.501814 4610 generic.go:334] "Generic (PLEG): container finished" podID="7e49b85b-bbed-4c13-b513-3d61369aa3c0" containerID="da8ad801cf50d7497be5922d6aa37b15f3e22a2132a423fb91c41ee3c65e159c" exitCode=0 Oct 06 09:17:07 crc kubenswrapper[4610]: I1006 09:17:07.501922 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" event={"ID":"7e49b85b-bbed-4c13-b513-3d61369aa3c0","Type":"ContainerDied","Data":"da8ad801cf50d7497be5922d6aa37b15f3e22a2132a423fb91c41ee3c65e159c"} Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.006223 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.052024 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-libvirt-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.052168 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-inventory\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.052222 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-repo-setup-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.052309 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-nova-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.052415 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054555 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054611 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs4nw\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-kube-api-access-zs4nw\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054645 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-neutron-metadata-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054698 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ovn-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 
crc kubenswrapper[4610]: I1006 09:17:09.054722 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-telemetry-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054748 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054785 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ssh-key\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054825 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-bootstrap-combined-ca-bundle\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.054912 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\" (UID: \"7e49b85b-bbed-4c13-b513-3d61369aa3c0\") " Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.061660 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.061999 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.065418 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.066663 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.066717 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.068760 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.069443 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-kube-api-access-zs4nw" (OuterVolumeSpecName: "kube-api-access-zs4nw") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "kube-api-access-zs4nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.070599 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.072326 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.080582 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.082463 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.090225 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.114079 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-inventory" (OuterVolumeSpecName: "inventory") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.117874 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e49b85b-bbed-4c13-b513-3d61369aa3c0" (UID: "7e49b85b-bbed-4c13-b513-3d61369aa3c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.157708 4610 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.157956 4610 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158086 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158165 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158240 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs4nw\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-kube-api-access-zs4nw\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158324 4610 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158394 4610 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158466 4610 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158538 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158614 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158688 4610 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158757 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e49b85b-bbed-4c13-b513-3d61369aa3c0-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158849 4610 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.158930 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e49b85b-bbed-4c13-b513-3d61369aa3c0-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.529727 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" event={"ID":"7e49b85b-bbed-4c13-b513-3d61369aa3c0","Type":"ContainerDied","Data":"efd42cfa2a1c932a1a9fbc291e3633a2abf44b0cb349542857c4bb091b14a4e2"} Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.530402 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd42cfa2a1c932a1a9fbc291e3633a2abf44b0cb349542857c4bb091b14a4e2" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.530210 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ps99p" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.651714 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk"] Oct 06 09:17:09 crc kubenswrapper[4610]: E1006 09:17:09.652159 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="extract-content" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.652180 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="extract-content" Oct 06 09:17:09 crc kubenswrapper[4610]: E1006 09:17:09.652198 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e49b85b-bbed-4c13-b513-3d61369aa3c0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.652208 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e49b85b-bbed-4c13-b513-3d61369aa3c0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 09:17:09 crc kubenswrapper[4610]: E1006 09:17:09.652223 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="extract-utilities" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.652232 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="extract-utilities" Oct 06 09:17:09 crc kubenswrapper[4610]: E1006 09:17:09.652248 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="registry-server" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.652258 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="registry-server" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.652486 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e49b85b-bbed-4c13-b513-3d61369aa3c0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.652525 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59ada84-9d6d-4b25-9723-8cd0c60059fe" containerName="registry-server" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.653213 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.658293 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.658393 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.661936 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.662888 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.665154 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.667429 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.667586 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.667815 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.667963 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.668018 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gsts\" (UniqueName: \"kubernetes.io/projected/e9eecc46-8a50-486b-ae37-2ba0f62b5216-kube-api-access-5gsts\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.728994 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk"] Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.770807 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.770953 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.771024 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.771179 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.771212 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gsts\" (UniqueName: \"kubernetes.io/projected/e9eecc46-8a50-486b-ae37-2ba0f62b5216-kube-api-access-5gsts\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.773560 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.798934 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.799546 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.799840 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") 
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.801754 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gsts\" (UniqueName: \"kubernetes.io/projected/e9eecc46-8a50-486b-ae37-2ba0f62b5216-kube-api-access-5gsts\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gbnnk\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:09 crc kubenswrapper[4610]: I1006 09:17:09.972010 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" Oct 06 09:17:10 crc kubenswrapper[4610]: I1006 09:17:10.492816 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk"] Oct 06 09:17:10 crc kubenswrapper[4610]: I1006 09:17:10.540770 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" event={"ID":"e9eecc46-8a50-486b-ae37-2ba0f62b5216","Type":"ContainerStarted","Data":"6e9e430ea35136c20833c37441d9d3188245e23b16d0f1547f22433bd2d14157"} Oct 06 09:17:11 crc kubenswrapper[4610]: I1006 09:17:11.556002 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" event={"ID":"e9eecc46-8a50-486b-ae37-2ba0f62b5216","Type":"ContainerStarted","Data":"81364f0e7548d1eceea9388c4c71709bc5a16121fc13b4e3de1166391da88fed"} Oct 06 09:17:11 crc kubenswrapper[4610]: I1006 09:17:11.586710 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" podStartSLOduration=2.401230133 podStartE2EDuration="2.58668931s" podCreationTimestamp="2025-10-06 09:17:09 +0000 UTC" firstStartedPulling="2025-10-06 09:17:10.504956639 +0000 UTC m=+2162.220010047" lastFinishedPulling="2025-10-06 09:17:10.690415826 +0000 UTC m=+2162.405469224" observedRunningTime="2025-10-06 09:17:11.580506095 +0000 UTC m=+2163.295559563" watchObservedRunningTime="2025-10-06 09:17:11.58668931 +0000 UTC m=+2163.301742698" Oct 06 09:17:36 crc kubenswrapper[4610]: I1006 09:17:36.983190 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s9shh"] Oct 06 09:17:36 crc kubenswrapper[4610]: I1006 09:17:36.985890 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.012088 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9shh"] Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.071198 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-catalog-content\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.071267 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k2dq\" (UniqueName: \"kubernetes.io/projected/ab31ee9b-dc1f-4326-be59-a97e197ae399-kube-api-access-6k2dq\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.071315 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-utilities\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.173256 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-catalog-content\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.173367 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k2dq\" (UniqueName: \"kubernetes.io/projected/ab31ee9b-dc1f-4326-be59-a97e197ae399-kube-api-access-6k2dq\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.173444 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-utilities\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.173774 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-catalog-content\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.173919 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-utilities\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.195997 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6k2dq\" (UniqueName: \"kubernetes.io/projected/ab31ee9b-dc1f-4326-be59-a97e197ae399-kube-api-access-6k2dq\") pod \"community-operators-s9shh\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:37 crc kubenswrapper[4610]: I1006 09:17:37.312736 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:38 crc kubenswrapper[4610]: I1006 09:17:38.008495 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9shh"] Oct 06 09:17:38 crc kubenswrapper[4610]: I1006 09:17:38.861014 4610 generic.go:334] "Generic (PLEG): container finished" podID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerID="56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0" exitCode=0 Oct 06 09:17:38 crc kubenswrapper[4610]: I1006 09:17:38.861453 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerDied","Data":"56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0"} Oct 06 09:17:38 crc kubenswrapper[4610]: I1006 09:17:38.861496 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerStarted","Data":"235282da24ffaafd7543437ed2d578690be58a84045ad02afe60795e037eeb45"} Oct 06 09:17:40 crc kubenswrapper[4610]: I1006 09:17:40.880184 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerStarted","Data":"ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4"} Oct 06 09:17:41 crc kubenswrapper[4610]: I1006 09:17:41.897320 4610 generic.go:334] "Generic (PLEG): container finished" podID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerID="ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4" exitCode=0 Oct 06 09:17:41 crc kubenswrapper[4610]: I1006 09:17:41.897450 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerDied","Data":"ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4"} Oct 06 09:17:42 crc kubenswrapper[4610]: I1006 09:17:42.913311 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerStarted","Data":"0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18"} Oct 06 09:17:42 crc kubenswrapper[4610]: I1006 09:17:42.954135 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s9shh" podStartSLOduration=3.404212965 podStartE2EDuration="6.954105277s" podCreationTimestamp="2025-10-06 09:17:36 +0000 UTC" firstStartedPulling="2025-10-06 09:17:38.86450268 +0000 UTC m=+2190.579556068" lastFinishedPulling="2025-10-06 09:17:42.414394962 +0000 UTC m=+2194.129448380" observedRunningTime="2025-10-06 09:17:42.946495004 +0000 UTC m=+2194.661548432" watchObservedRunningTime="2025-10-06 09:17:42.954105277 +0000 UTC m=+2194.669158705" Oct 06 09:17:46 crc kubenswrapper[4610]: I1006 09:17:46.469464 4610 patch_prober.go:28] interesting 
pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:17:46 crc kubenswrapper[4610]: I1006 09:17:46.469832 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:17:47 crc kubenswrapper[4610]: I1006 09:17:47.313840 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:47 crc kubenswrapper[4610]: I1006 09:17:47.313911 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:47 crc kubenswrapper[4610]: I1006 09:17:47.372148 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:48 crc kubenswrapper[4610]: I1006 09:17:48.000372 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:48 crc kubenswrapper[4610]: I1006 09:17:48.061849 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s9shh"] Oct 06 09:17:49 crc kubenswrapper[4610]: I1006 09:17:49.973390 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s9shh" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="registry-server" containerID="cri-o://0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18" gracePeriod=2 Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.436110 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.553416 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k2dq\" (UniqueName: \"kubernetes.io/projected/ab31ee9b-dc1f-4326-be59-a97e197ae399-kube-api-access-6k2dq\") pod \"ab31ee9b-dc1f-4326-be59-a97e197ae399\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.553911 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-catalog-content\") pod \"ab31ee9b-dc1f-4326-be59-a97e197ae399\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.554016 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-utilities\") pod \"ab31ee9b-dc1f-4326-be59-a97e197ae399\" (UID: \"ab31ee9b-dc1f-4326-be59-a97e197ae399\") " Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.555621 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-utilities" (OuterVolumeSpecName: "utilities") pod "ab31ee9b-dc1f-4326-be59-a97e197ae399" (UID: "ab31ee9b-dc1f-4326-be59-a97e197ae399"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.560293 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab31ee9b-dc1f-4326-be59-a97e197ae399-kube-api-access-6k2dq" (OuterVolumeSpecName: "kube-api-access-6k2dq") pod "ab31ee9b-dc1f-4326-be59-a97e197ae399" (UID: "ab31ee9b-dc1f-4326-be59-a97e197ae399"). InnerVolumeSpecName "kube-api-access-6k2dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.631830 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab31ee9b-dc1f-4326-be59-a97e197ae399" (UID: "ab31ee9b-dc1f-4326-be59-a97e197ae399"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.655874 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k2dq\" (UniqueName: \"kubernetes.io/projected/ab31ee9b-dc1f-4326-be59-a97e197ae399-kube-api-access-6k2dq\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.655910 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.655921 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab31ee9b-dc1f-4326-be59-a97e197ae399-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.984657 4610 generic.go:334] "Generic (PLEG): container finished" podID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerID="0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18" exitCode=0 Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.984739 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerDied","Data":"0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18"} Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.984798 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9shh" event={"ID":"ab31ee9b-dc1f-4326-be59-a97e197ae399","Type":"ContainerDied","Data":"235282da24ffaafd7543437ed2d578690be58a84045ad02afe60795e037eeb45"} Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.984830 4610 scope.go:117] "RemoveContainer" containerID="0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18" Oct 06 09:17:50 crc kubenswrapper[4610]: I1006 09:17:50.985124 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s9shh" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.041294 4610 scope.go:117] "RemoveContainer" containerID="ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.041490 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s9shh"] Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.050904 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s9shh"] Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.068319 4610 scope.go:117] "RemoveContainer" containerID="56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.092160 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" path="/var/lib/kubelet/pods/ab31ee9b-dc1f-4326-be59-a97e197ae399/volumes" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.115530 4610 scope.go:117] "RemoveContainer" containerID="0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18" Oct 06 09:17:51 crc kubenswrapper[4610]: E1006 09:17:51.116156 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18\": container with ID starting with 0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18 not found: ID does not exist" containerID="0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.116223 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18"} err="failed to get container status \"0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18\": rpc error: code = NotFound desc = could not find container \"0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18\": container with ID starting with 0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18 not found: ID does not exist" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.116258 4610 scope.go:117] "RemoveContainer" containerID="ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4" Oct 06 09:17:51 crc kubenswrapper[4610]: E1006 09:17:51.116697 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4\": container with ID starting with ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4 not found: ID does not exist" containerID="ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.116725 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4"} err="failed to get container status \"ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4\": rpc error: code = NotFound desc = could not find container \"ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4\": container with ID starting with ff14c482ebec648bdea841c8d2bee1027f473c13ebb822de913a3f84050cb3e4 not found: ID does not exist" Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 
Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.116748 4610 scope.go:117] "RemoveContainer" containerID="56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0"
Oct 06 09:17:51 crc kubenswrapper[4610]: E1006 09:17:51.117330 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0\": container with ID starting with 56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0 not found: ID does not exist" containerID="56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0"
Oct 06 09:17:51 crc kubenswrapper[4610]: I1006 09:17:51.117359 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0"} err="failed to get container status \"56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0\": rpc error: code = NotFound desc = could not find container \"56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0\": container with ID starting with 56a187efb462bde9921213ed348ff14623ecc1c322748b657b1a2c2f5ad260f0 not found: ID does not exist"
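[Editor's note] The RemoveContainer / NotFound pairs above are benign: while pruning an already-deleted pod, the kubelet asks the runtime for each container's status over CRI, and CRI-O answers NotFound because the container is gone. A minimal sketch of issuing the same query and tolerating NotFound, assuming the CRI v1 API and the CRI-O socket at /var/run/crio/crio.sock (the socket path is an assumption; the kubelet takes it from --container-runtime-endpoint):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Dial the runtime socket (path is an assumption, see note above).
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	client := runtimeapi.NewRuntimeServiceClient(conn)

	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	id := "0d142cf070290d9ce6dca79734f9ab1d246414cbb6e21e560c56b6a091104d18"
	resp, err := client.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
	switch {
	case status.Code(err) == codes.NotFound:
		// The condition logged above: the container was already removed,
		// so there is nothing left to delete and the error can be ignored.
		fmt.Println("already gone:", id)
	case err != nil:
		panic(err)
	default:
		fmt.Println("state:", resp.GetStatus().GetState())
	}
}
```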
Oct 06 09:18:16 crc kubenswrapper[4610]: I1006 09:18:16.469012 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:18:16 crc kubenswrapper[4610]: I1006 09:18:16.469531 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:18:23 crc kubenswrapper[4610]: I1006 09:18:23.319237 4610 generic.go:334] "Generic (PLEG): container finished" podID="e9eecc46-8a50-486b-ae37-2ba0f62b5216" containerID="81364f0e7548d1eceea9388c4c71709bc5a16121fc13b4e3de1166391da88fed" exitCode=0
Oct 06 09:18:23 crc kubenswrapper[4610]: I1006 09:18:23.319342 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" event={"ID":"e9eecc46-8a50-486b-ae37-2ba0f62b5216","Type":"ContainerDied","Data":"81364f0e7548d1eceea9388c4c71709bc5a16121fc13b4e3de1166391da88fed"}
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.821332 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk"
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.958926 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ssh-key\") pod \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") "
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.959323 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-inventory\") pod \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") "
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.959611 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovn-combined-ca-bundle\") pod \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") "
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.959764 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gsts\" (UniqueName: \"kubernetes.io/projected/e9eecc46-8a50-486b-ae37-2ba0f62b5216-kube-api-access-5gsts\") pod \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") "
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.959898 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovncontroller-config-0\") pod \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\" (UID: \"e9eecc46-8a50-486b-ae37-2ba0f62b5216\") "
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.968218 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9eecc46-8a50-486b-ae37-2ba0f62b5216-kube-api-access-5gsts" (OuterVolumeSpecName: "kube-api-access-5gsts") pod "e9eecc46-8a50-486b-ae37-2ba0f62b5216" (UID: "e9eecc46-8a50-486b-ae37-2ba0f62b5216"). InnerVolumeSpecName "kube-api-access-5gsts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:18:24 crc kubenswrapper[4610]: I1006 09:18:24.969749 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e9eecc46-8a50-486b-ae37-2ba0f62b5216" (UID: "e9eecc46-8a50-486b-ae37-2ba0f62b5216"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:24.999552 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9eecc46-8a50-486b-ae37-2ba0f62b5216" (UID: "e9eecc46-8a50-486b-ae37-2ba0f62b5216"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.000101 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e9eecc46-8a50-486b-ae37-2ba0f62b5216" (UID: "e9eecc46-8a50-486b-ae37-2ba0f62b5216"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.005365 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-inventory" (OuterVolumeSpecName: "inventory") pod "e9eecc46-8a50-486b-ae37-2ba0f62b5216" (UID: "e9eecc46-8a50-486b-ae37-2ba0f62b5216"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.063088 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gsts\" (UniqueName: \"kubernetes.io/projected/e9eecc46-8a50-486b-ae37-2ba0f62b5216-kube-api-access-5gsts\") on node \"crc\" DevicePath \"\""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.063132 4610 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.063171 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.063186 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.063234 4610 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eecc46-8a50-486b-ae37-2ba0f62b5216-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.341614 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk" event={"ID":"e9eecc46-8a50-486b-ae37-2ba0f62b5216","Type":"ContainerDied","Data":"6e9e430ea35136c20833c37441d9d3188245e23b16d0f1547f22433bd2d14157"}
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.341992 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e9e430ea35136c20833c37441d9d3188245e23b16d0f1547f22433bd2d14157"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.341689 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gbnnk"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.427082 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn"]
Oct 06 09:18:25 crc kubenswrapper[4610]: E1006 09:18:25.427696 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="extract-utilities"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.427778 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="extract-utilities"
Oct 06 09:18:25 crc kubenswrapper[4610]: E1006 09:18:25.427848 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9eecc46-8a50-486b-ae37-2ba0f62b5216" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.427902 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9eecc46-8a50-486b-ae37-2ba0f62b5216" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:18:25 crc kubenswrapper[4610]: E1006 09:18:25.427972 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="registry-server"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.428036 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="registry-server"
Oct 06 09:18:25 crc kubenswrapper[4610]: E1006 09:18:25.428555 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="extract-content"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.428616 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="extract-content"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.428849 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9eecc46-8a50-486b-ae37-2ba0f62b5216" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.428912 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab31ee9b-dc1f-4326-be59-a97e197ae399" containerName="registry-server"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.429621 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.432123 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.432180 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.432681 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.432845 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.432851 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.433001 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.441699 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn"]
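[Editor's note] The reflector.go "Caches populated for ... from object-..." lines show the kubelet warming a single-object watch for each Secret/ConfigMap the new pod mounts, rather than listing whole namespaces. A rough client-go equivalent of one such per-object watch (the secret name and namespace are taken from the log; the rest is a generic sketch, not kubelet code):

```go
package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/fields"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// List+watch exactly one Secret by name, scoped with a field selector,
	// the way the kubelet tracks each mounted object individually.
	lw := cache.NewListWatchFromClient(
		cs.CoreV1().RESTClient(), "secrets", "openstack",
		fields.OneTermEqualSelector("metadata.name", "nova-metadata-neutron-config"))

	_, ctrl := cache.NewInformer(lw, &corev1.Secret{}, 30*time.Second,
		cache.ResourceEventHandlerFuncs{
			AddFunc:    func(obj interface{}) { fmt.Println("cache populated") },
			UpdateFunc: func(_, obj interface{}) { fmt.Println("secret updated") },
		})

	stop := make(chan struct{})
	defer close(stop)
	ctrl.Run(stop) // the first successful List+Watch is the "Caches populated" moment
}
```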
\"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.574027 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.676126 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76wtt\" (UniqueName: \"kubernetes.io/projected/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-kube-api-access-76wtt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.676229 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.676267 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.676298 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.676366 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.676394 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 
crc kubenswrapper[4610]: I1006 09:18:25.681666 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.682486 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.683777 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.687746 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.687750 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.694326 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76wtt\" (UniqueName: \"kubernetes.io/projected/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-kube-api-access-76wtt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:25 crc kubenswrapper[4610]: I1006 09:18:25.750598 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:18:26 crc kubenswrapper[4610]: I1006 09:18:26.269536 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn"] Oct 06 09:18:26 crc kubenswrapper[4610]: I1006 09:18:26.352554 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" event={"ID":"4288043c-e9b4-4c1c-8234-3f44be6fbc2f","Type":"ContainerStarted","Data":"f43f6928a81b5ac55367691115672c8d0d6765b7b9b0cc44d489a9e705d8a0f3"} Oct 06 09:18:27 crc kubenswrapper[4610]: I1006 09:18:27.366656 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" event={"ID":"4288043c-e9b4-4c1c-8234-3f44be6fbc2f","Type":"ContainerStarted","Data":"4df47465854a27ea871f038b067ac4c9ae31e3c8f239f83c1cf5bf853ff15947"} Oct 06 09:18:27 crc kubenswrapper[4610]: I1006 09:18:27.387026 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" podStartSLOduration=2.244020829 podStartE2EDuration="2.387004565s" podCreationTimestamp="2025-10-06 09:18:25 +0000 UTC" firstStartedPulling="2025-10-06 09:18:26.278011617 +0000 UTC m=+2237.993065005" lastFinishedPulling="2025-10-06 09:18:26.420995353 +0000 UTC m=+2238.136048741" observedRunningTime="2025-10-06 09:18:27.380441801 +0000 UTC m=+2239.095495189" watchObservedRunningTime="2025-10-06 09:18:27.387004565 +0000 UTC m=+2239.102057973" Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.469751 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.471257 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.471374 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.472314 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.472387 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" gracePeriod=600 Oct 06 09:18:46 crc kubenswrapper[4610]: E1006 09:18:46.603802 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
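[Editor's note] The numbers in the pod_startup_latency_tracker entry above are self-consistent: podStartE2EDuration is the watch-observed running time minus podCreationTimestamp, and podStartSLOduration is that figure with the image-pull window (firstStartedPulling to lastFinishedPulling) subtracted. A quick check, assuming exactly those semantics (they match this entry's numbers but are not stated in the log itself):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching Go's time.Time String() output as it appears in the log
	// (the monotonic "m=+..." suffix is dropped here).
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-06 09:18:25 +0000 UTC")
	observed := parse("2025-10-06 09:18:27.387004565 +0000 UTC")
	pullStart := parse("2025-10-06 09:18:26.278011617 +0000 UTC")
	pullEnd := parse("2025-10-06 09:18:26.420995353 +0000 UTC")

	e2e := observed.Sub(created)        // 2.387004565s = podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // 2.244020829s = podStartSLOduration
	fmt.Println(e2e, slo)
}
```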
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.609706 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" exitCode=0 Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.609795 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"} Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.609969 4610 scope.go:117] "RemoveContainer" containerID="a7252b6aaf2929a0234f9cea134ab6667be1abfa6b1537e1c78a905ebc421a87" Oct 06 09:18:46 crc kubenswrapper[4610]: I1006 09:18:46.610650 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:18:46 crc kubenswrapper[4610]: E1006 09:18:46.610887 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:18:59 crc kubenswrapper[4610]: I1006 09:18:59.086013 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:18:59 crc kubenswrapper[4610]: E1006 09:18:59.087751 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:19:10 crc kubenswrapper[4610]: I1006 09:19:10.070847 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:19:10 crc kubenswrapper[4610]: E1006 09:19:10.073237 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:19:20 crc kubenswrapper[4610]: E1006 09:19:20.201594 4610 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Oct 06 09:19:22 crc kubenswrapper[4610]: I1006 09:19:22.967992 4610 generic.go:334] "Generic (PLEG): container finished" 
podID="4288043c-e9b4-4c1c-8234-3f44be6fbc2f" containerID="4df47465854a27ea871f038b067ac4c9ae31e3c8f239f83c1cf5bf853ff15947" exitCode=0 Oct 06 09:19:22 crc kubenswrapper[4610]: I1006 09:19:22.968127 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" event={"ID":"4288043c-e9b4-4c1c-8234-3f44be6fbc2f","Type":"ContainerDied","Data":"4df47465854a27ea871f038b067ac4c9ae31e3c8f239f83c1cf5bf853ff15947"} Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.070505 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:19:24 crc kubenswrapper[4610]: E1006 09:19:24.070845 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.431591 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.564854 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-inventory\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.565267 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76wtt\" (UniqueName: \"kubernetes.io/projected/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-kube-api-access-76wtt\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.565567 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.565613 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-ssh-key\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.565644 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-metadata-combined-ca-bundle\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.565838 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" 
(UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.571031 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.573606 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-kube-api-access-76wtt" (OuterVolumeSpecName: "kube-api-access-76wtt") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f"). InnerVolumeSpecName "kube-api-access-76wtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.600286 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.606154 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:19:24 crc kubenswrapper[4610]: E1006 09:19:24.622360 4610 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0 podName:4288043c-e9b4-4c1c-8234-3f44be6fbc2f nodeName:}" failed. No retries permitted until 2025-10-06 09:19:25.122330378 +0000 UTC m=+2296.837383776 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "nova-metadata-neutron-config-0" (UniqueName: "kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f") : error deleting /var/lib/kubelet/pods/4288043c-e9b4-4c1c-8234-3f44be6fbc2f/volume-subpaths: remove /var/lib/kubelet/pods/4288043c-e9b4-4c1c-8234-3f44be6fbc2f/volume-subpaths: no such file or directory Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.625413 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-inventory" (OuterVolumeSpecName: "inventory") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.668360 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.668413 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76wtt\" (UniqueName: \"kubernetes.io/projected/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-kube-api-access-76wtt\") on node \"crc\" DevicePath \"\"" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.668434 4610 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.668455 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.668474 4610 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.993675 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" event={"ID":"4288043c-e9b4-4c1c-8234-3f44be6fbc2f","Type":"ContainerDied","Data":"f43f6928a81b5ac55367691115672c8d0d6765b7b9b0cc44d489a9e705d8a0f3"} Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.993738 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43f6928a81b5ac55367691115672c8d0d6765b7b9b0cc44d489a9e705d8a0f3" Oct 06 09:19:24 crc kubenswrapper[4610]: I1006 09:19:24.993778 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.119752 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth"] Oct 06 09:19:25 crc kubenswrapper[4610]: E1006 09:19:25.120131 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4288043c-e9b4-4c1c-8234-3f44be6fbc2f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.120146 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4288043c-e9b4-4c1c-8234-3f44be6fbc2f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.120332 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4288043c-e9b4-4c1c-8234-3f44be6fbc2f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.120907 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.123864 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.179911 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0\") pod \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\" (UID: \"4288043c-e9b4-4c1c-8234-3f44be6fbc2f\") " Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.185804 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth"] Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.185829 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4288043c-e9b4-4c1c-8234-3f44be6fbc2f" (UID: "4288043c-e9b4-4c1c-8234-3f44be6fbc2f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.282889 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.283152 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2wtg\" (UniqueName: \"kubernetes.io/projected/7198dacf-4e83-415a-a302-d543a7c2fea9-kube-api-access-f2wtg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.283328 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.283367 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.283412 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.283632 4610 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4288043c-e9b4-4c1c-8234-3f44be6fbc2f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.385630 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.385728 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.385797 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.386117 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.386189 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2wtg\" (UniqueName: \"kubernetes.io/projected/7198dacf-4e83-415a-a302-d543a7c2fea9-kube-api-access-f2wtg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.390008 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.391503 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.391727 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.392900 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.401312 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2wtg\" (UniqueName: \"kubernetes.io/projected/7198dacf-4e83-415a-a302-d543a7c2fea9-kube-api-access-f2wtg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9gsth\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.486469 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:19:25 crc kubenswrapper[4610]: I1006 09:19:25.999375 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth"] Oct 06 09:19:26 crc kubenswrapper[4610]: W1006 09:19:26.004121 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7198dacf_4e83_415a_a302_d543a7c2fea9.slice/crio-a5bb5b81ab6d49cc83808e06f3cac5df5ea813a7dafa8816f44c0fb3509c9149 WatchSource:0}: Error finding container a5bb5b81ab6d49cc83808e06f3cac5df5ea813a7dafa8816f44c0fb3509c9149: Status 404 returned error can't find the container with id a5bb5b81ab6d49cc83808e06f3cac5df5ea813a7dafa8816f44c0fb3509c9149 Oct 06 09:19:27 crc kubenswrapper[4610]: I1006 09:19:27.011376 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" event={"ID":"7198dacf-4e83-415a-a302-d543a7c2fea9","Type":"ContainerStarted","Data":"435a084a238902eaecde014a6cbabaf6e08227e3aa4171bcb3b4a9829f786777"} Oct 06 09:19:27 crc kubenswrapper[4610]: I1006 09:19:27.012489 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" event={"ID":"7198dacf-4e83-415a-a302-d543a7c2fea9","Type":"ContainerStarted","Data":"a5bb5b81ab6d49cc83808e06f3cac5df5ea813a7dafa8816f44c0fb3509c9149"} Oct 06 09:19:27 crc kubenswrapper[4610]: I1006 09:19:27.055813 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" podStartSLOduration=1.8634057849999999 podStartE2EDuration="2.055785338s" podCreationTimestamp="2025-10-06 09:19:25 +0000 UTC" firstStartedPulling="2025-10-06 09:19:26.005856214 +0000 UTC m=+2297.720909602" lastFinishedPulling="2025-10-06 09:19:26.198235767 +0000 UTC m=+2297.913289155" observedRunningTime="2025-10-06 09:19:27.038606914 +0000 UTC m=+2298.753660312" watchObservedRunningTime="2025-10-06 09:19:27.055785338 +0000 UTC m=+2298.770838756" Oct 06 09:19:37 crc kubenswrapper[4610]: I1006 09:19:37.070618 4610 scope.go:117] "RemoveContainer" 
containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:19:37 crc kubenswrapper[4610]: E1006 09:19:37.071500 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:19:50 crc kubenswrapper[4610]: I1006 09:19:50.070420 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:19:50 crc kubenswrapper[4610]: E1006 09:19:50.071093 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:20:03 crc kubenswrapper[4610]: I1006 09:20:03.070566 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:20:03 crc kubenswrapper[4610]: E1006 09:20:03.071316 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:20:15 crc kubenswrapper[4610]: I1006 09:20:15.071563 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:20:15 crc kubenswrapper[4610]: E1006 09:20:15.072775 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:20:28 crc kubenswrapper[4610]: I1006 09:20:28.070827 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:20:28 crc kubenswrapper[4610]: E1006 09:20:28.071886 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:20:39 crc kubenswrapper[4610]: I1006 09:20:39.081429 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:20:39 crc kubenswrapper[4610]: E1006 09:20:39.082238 4610 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:20:52 crc kubenswrapper[4610]: I1006 09:20:52.070771 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:20:52 crc kubenswrapper[4610]: E1006 09:20:52.071725 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:21:04 crc kubenswrapper[4610]: I1006 09:21:04.070200 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:21:04 crc kubenswrapper[4610]: E1006 09:21:04.070889 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:21:17 crc kubenswrapper[4610]: I1006 09:21:17.070745 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:21:17 crc kubenswrapper[4610]: E1006 09:21:17.071555 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:21:28 crc kubenswrapper[4610]: I1006 09:21:28.070805 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:21:28 crc kubenswrapper[4610]: E1006 09:21:28.072030 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:21:41 crc kubenswrapper[4610]: I1006 09:21:41.071776 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:21:41 crc kubenswrapper[4610]: E1006 09:21:41.072516 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 06 09:21:41 crc kubenswrapper[4610]: I1006 09:21:41.071776 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:21:41 crc kubenswrapper[4610]: E1006 09:21:41.072516 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:21:52 crc kubenswrapper[4610]: I1006 09:21:52.070147 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:21:52 crc kubenswrapper[4610]: E1006 09:21:52.072063 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:22:06 crc kubenswrapper[4610]: I1006 09:22:06.070679 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:22:06 crc kubenswrapper[4610]: E1006 09:22:06.071390 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:22:20 crc kubenswrapper[4610]: I1006 09:22:20.071654 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:22:20 crc kubenswrapper[4610]: E1006 09:22:20.072792 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:22:32 crc kubenswrapper[4610]: I1006 09:22:32.070676 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:22:32 crc kubenswrapper[4610]: E1006 09:22:32.071902 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:22:45 crc kubenswrapper[4610]: I1006 09:22:45.071778 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:22:45 crc kubenswrapper[4610]: E1006 09:22:45.072621 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:22:56 crc kubenswrapper[4610]: I1006 09:22:56.071060 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:22:56 crc kubenswrapper[4610]: E1006 09:22:56.071824 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:23:10 crc kubenswrapper[4610]: I1006 09:23:10.071406 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:23:10 crc kubenswrapper[4610]: E1006 09:23:10.072059 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:23:21 crc kubenswrapper[4610]: I1006 09:23:21.070834 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:23:21 crc kubenswrapper[4610]: E1006 09:23:21.071561 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:23:33 crc kubenswrapper[4610]: I1006 09:23:33.070774 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:23:33 crc kubenswrapper[4610]: E1006 09:23:33.071970 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:23:48 crc kubenswrapper[4610]: I1006 09:23:48.070556 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31"
Oct 06 09:23:48 crc kubenswrapper[4610]: I1006 09:23:48.566209 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"e6f96bf18662270aeb2cd015583e405a7ccb309c85ec64e719f99df2d9fc0c9a"}
Oct 06 09:24:02 crc kubenswrapper[4610]: I1006 09:24:02.690569 4610 generic.go:334] "Generic (PLEG): container finished" podID="7198dacf-4e83-415a-a302-d543a7c2fea9" containerID="435a084a238902eaecde014a6cbabaf6e08227e3aa4171bcb3b4a9829f786777" exitCode=0
Oct 06 09:24:02 crc kubenswrapper[4610]: I1006
09:24:02.690676 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" event={"ID":"7198dacf-4e83-415a-a302-d543a7c2fea9","Type":"ContainerDied","Data":"435a084a238902eaecde014a6cbabaf6e08227e3aa4171bcb3b4a9829f786777"} Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.153437 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.248430 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-combined-ca-bundle\") pod \"7198dacf-4e83-415a-a302-d543a7c2fea9\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.248471 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-inventory\") pod \"7198dacf-4e83-415a-a302-d543a7c2fea9\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.248497 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-ssh-key\") pod \"7198dacf-4e83-415a-a302-d543a7c2fea9\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.248582 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-secret-0\") pod \"7198dacf-4e83-415a-a302-d543a7c2fea9\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.248654 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2wtg\" (UniqueName: \"kubernetes.io/projected/7198dacf-4e83-415a-a302-d543a7c2fea9-kube-api-access-f2wtg\") pod \"7198dacf-4e83-415a-a302-d543a7c2fea9\" (UID: \"7198dacf-4e83-415a-a302-d543a7c2fea9\") " Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.256239 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7198dacf-4e83-415a-a302-d543a7c2fea9" (UID: "7198dacf-4e83-415a-a302-d543a7c2fea9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.258178 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7198dacf-4e83-415a-a302-d543a7c2fea9-kube-api-access-f2wtg" (OuterVolumeSpecName: "kube-api-access-f2wtg") pod "7198dacf-4e83-415a-a302-d543a7c2fea9" (UID: "7198dacf-4e83-415a-a302-d543a7c2fea9"). InnerVolumeSpecName "kube-api-access-f2wtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.282033 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7198dacf-4e83-415a-a302-d543a7c2fea9" (UID: "7198dacf-4e83-415a-a302-d543a7c2fea9"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.282117 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7198dacf-4e83-415a-a302-d543a7c2fea9" (UID: "7198dacf-4e83-415a-a302-d543a7c2fea9"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.290005 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-inventory" (OuterVolumeSpecName: "inventory") pod "7198dacf-4e83-415a-a302-d543a7c2fea9" (UID: "7198dacf-4e83-415a-a302-d543a7c2fea9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.351478 4610 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.351520 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.351530 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.351539 4610 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7198dacf-4e83-415a-a302-d543a7c2fea9-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.351552 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2wtg\" (UniqueName: \"kubernetes.io/projected/7198dacf-4e83-415a-a302-d543a7c2fea9-kube-api-access-f2wtg\") on node \"crc\" DevicePath \"\"" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.715603 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" event={"ID":"7198dacf-4e83-415a-a302-d543a7c2fea9","Type":"ContainerDied","Data":"a5bb5b81ab6d49cc83808e06f3cac5df5ea813a7dafa8816f44c0fb3509c9149"} Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.715848 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5bb5b81ab6d49cc83808e06f3cac5df5ea813a7dafa8816f44c0fb3509c9149" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.715861 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9gsth" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.849526 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh"] Oct 06 09:24:04 crc kubenswrapper[4610]: E1006 09:24:04.849954 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7198dacf-4e83-415a-a302-d543a7c2fea9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.849970 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7198dacf-4e83-415a-a302-d543a7c2fea9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.850193 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7198dacf-4e83-415a-a302-d543a7c2fea9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.850837 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.854220 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.854632 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.854856 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.854992 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.855210 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.855337 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.855502 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.868903 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh"] Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.971553 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.971927 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972024 4610 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972156 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972266 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqb5k\" (UniqueName: \"kubernetes.io/projected/3ba105b9-3b48-4236-a86d-6fcded83393a-kube-api-access-nqb5k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972415 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972462 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972496 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:04 crc kubenswrapper[4610]: I1006 09:24:04.972639 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.073839 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 
09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.073887 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.073924 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.073975 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqb5k\" (UniqueName: \"kubernetes.io/projected/3ba105b9-3b48-4236-a86d-6fcded83393a-kube-api-access-nqb5k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.074642 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.074742 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.074773 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.074814 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.074867 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.075676 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.079023 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.079456 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.079808 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.080979 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.082024 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.098653 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.099332 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.105491 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqb5k\" (UniqueName: 
\"kubernetes.io/projected/3ba105b9-3b48-4236-a86d-6fcded83393a-kube-api-access-nqb5k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jxfdh\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.177242 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.711550 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh"] Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.712880 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:24:05 crc kubenswrapper[4610]: I1006 09:24:05.730173 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" event={"ID":"3ba105b9-3b48-4236-a86d-6fcded83393a","Type":"ContainerStarted","Data":"9297843248c008a7b5461f71198d1beb4ddc3494e4a5b20e57b55f285f93910d"} Oct 06 09:24:06 crc kubenswrapper[4610]: I1006 09:24:06.740135 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" event={"ID":"3ba105b9-3b48-4236-a86d-6fcded83393a","Type":"ContainerStarted","Data":"7726683d2f01a8db8fb69f93cee4e44001504c2ac4664a4a42279646ddf928cb"} Oct 06 09:24:06 crc kubenswrapper[4610]: I1006 09:24:06.761229 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" podStartSLOduration=2.553311339 podStartE2EDuration="2.761214128s" podCreationTimestamp="2025-10-06 09:24:04 +0000 UTC" firstStartedPulling="2025-10-06 09:24:05.712618738 +0000 UTC m=+2577.427672136" lastFinishedPulling="2025-10-06 09:24:05.920521537 +0000 UTC m=+2577.635574925" observedRunningTime="2025-10-06 09:24:06.758630921 +0000 UTC m=+2578.473684329" watchObservedRunningTime="2025-10-06 09:24:06.761214128 +0000 UTC m=+2578.476267516" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.608090 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vxcwz"] Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.610236 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.625228 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxcwz"] Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.769551 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-catalog-content\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.769895 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-utilities\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.770075 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpm4t\" (UniqueName: \"kubernetes.io/projected/65e01d06-ad18-4579-97ef-5a5c258c9c24-kube-api-access-kpm4t\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.871877 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-catalog-content\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.871990 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-utilities\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.872101 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpm4t\" (UniqueName: \"kubernetes.io/projected/65e01d06-ad18-4579-97ef-5a5c258c9c24-kube-api-access-kpm4t\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.872388 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-utilities\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.872639 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-catalog-content\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.895973 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kpm4t\" (UniqueName: \"kubernetes.io/projected/65e01d06-ad18-4579-97ef-5a5c258c9c24-kube-api-access-kpm4t\") pod \"redhat-operators-vxcwz\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:57 crc kubenswrapper[4610]: I1006 09:24:57.935125 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:24:58 crc kubenswrapper[4610]: I1006 09:24:58.276455 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxcwz"] Oct 06 09:24:58 crc kubenswrapper[4610]: I1006 09:24:58.333152 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerStarted","Data":"03372d247e0597445801b5a294daa6bcf1a20c4c7d4d7ae27a26f497c814b9f6"} Oct 06 09:24:59 crc kubenswrapper[4610]: I1006 09:24:59.343082 4610 generic.go:334] "Generic (PLEG): container finished" podID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerID="0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79" exitCode=0 Oct 06 09:24:59 crc kubenswrapper[4610]: I1006 09:24:59.343377 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerDied","Data":"0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79"} Oct 06 09:25:00 crc kubenswrapper[4610]: I1006 09:25:00.357884 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerStarted","Data":"dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d"} Oct 06 09:25:04 crc kubenswrapper[4610]: I1006 09:25:04.404718 4610 generic.go:334] "Generic (PLEG): container finished" podID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerID="dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d" exitCode=0 Oct 06 09:25:04 crc kubenswrapper[4610]: I1006 09:25:04.404810 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerDied","Data":"dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d"} Oct 06 09:25:05 crc kubenswrapper[4610]: I1006 09:25:05.421740 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerStarted","Data":"2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130"} Oct 06 09:25:05 crc kubenswrapper[4610]: I1006 09:25:05.444497 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vxcwz" podStartSLOduration=2.948561856 podStartE2EDuration="8.444479981s" podCreationTimestamp="2025-10-06 09:24:57 +0000 UTC" firstStartedPulling="2025-10-06 09:24:59.344714566 +0000 UTC m=+2631.059767954" lastFinishedPulling="2025-10-06 09:25:04.840632661 +0000 UTC m=+2636.555686079" observedRunningTime="2025-10-06 09:25:05.439490411 +0000 UTC m=+2637.154543819" watchObservedRunningTime="2025-10-06 09:25:05.444479981 +0000 UTC m=+2637.159533369" Oct 06 09:25:07 crc kubenswrapper[4610]: I1006 09:25:07.937543 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vxcwz" 
Oct 06 09:25:07 crc kubenswrapper[4610]: I1006 09:25:07.937804 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:25:09 crc kubenswrapper[4610]: I1006 09:25:09.053387 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vxcwz" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="registry-server" probeResult="failure" output=< Oct 06 09:25:09 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:25:09 crc kubenswrapper[4610]: > Oct 06 09:25:17 crc kubenswrapper[4610]: I1006 09:25:17.986901 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:25:18 crc kubenswrapper[4610]: I1006 09:25:18.047860 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:25:18 crc kubenswrapper[4610]: I1006 09:25:18.235530 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxcwz"] Oct 06 09:25:19 crc kubenswrapper[4610]: I1006 09:25:19.547271 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vxcwz" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="registry-server" containerID="cri-o://2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130" gracePeriod=2 Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.508661 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.591761 4610 generic.go:334] "Generic (PLEG): container finished" podID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerID="2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130" exitCode=0 Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.591813 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerDied","Data":"2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130"} Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.591844 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxcwz" event={"ID":"65e01d06-ad18-4579-97ef-5a5c258c9c24","Type":"ContainerDied","Data":"03372d247e0597445801b5a294daa6bcf1a20c4c7d4d7ae27a26f497c814b9f6"} Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.591864 4610 scope.go:117] "RemoveContainer" containerID="2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.592849 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxcwz" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.598214 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpm4t\" (UniqueName: \"kubernetes.io/projected/65e01d06-ad18-4579-97ef-5a5c258c9c24-kube-api-access-kpm4t\") pod \"65e01d06-ad18-4579-97ef-5a5c258c9c24\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.598472 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-utilities\") pod \"65e01d06-ad18-4579-97ef-5a5c258c9c24\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.598630 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-catalog-content\") pod \"65e01d06-ad18-4579-97ef-5a5c258c9c24\" (UID: \"65e01d06-ad18-4579-97ef-5a5c258c9c24\") " Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.600433 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-utilities" (OuterVolumeSpecName: "utilities") pod "65e01d06-ad18-4579-97ef-5a5c258c9c24" (UID: "65e01d06-ad18-4579-97ef-5a5c258c9c24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.623013 4610 scope.go:117] "RemoveContainer" containerID="dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.623120 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e01d06-ad18-4579-97ef-5a5c258c9c24-kube-api-access-kpm4t" (OuterVolumeSpecName: "kube-api-access-kpm4t") pod "65e01d06-ad18-4579-97ef-5a5c258c9c24" (UID: "65e01d06-ad18-4579-97ef-5a5c258c9c24"). InnerVolumeSpecName "kube-api-access-kpm4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.666367 4610 scope.go:117] "RemoveContainer" containerID="0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.695768 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65e01d06-ad18-4579-97ef-5a5c258c9c24" (UID: "65e01d06-ad18-4579-97ef-5a5c258c9c24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.701235 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpm4t\" (UniqueName: \"kubernetes.io/projected/65e01d06-ad18-4579-97ef-5a5c258c9c24-kube-api-access-kpm4t\") on node \"crc\" DevicePath \"\"" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.701261 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.701272 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e01d06-ad18-4579-97ef-5a5c258c9c24-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.715301 4610 scope.go:117] "RemoveContainer" containerID="2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130" Oct 06 09:25:20 crc kubenswrapper[4610]: E1006 09:25:20.715987 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130\": container with ID starting with 2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130 not found: ID does not exist" containerID="2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.716029 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130"} err="failed to get container status \"2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130\": rpc error: code = NotFound desc = could not find container \"2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130\": container with ID starting with 2da33c2d775bcc953f093e09c7f3556c0307cacff1d5a508a1abf8d34ae07130 not found: ID does not exist" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.716070 4610 scope.go:117] "RemoveContainer" containerID="dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d" Oct 06 09:25:20 crc kubenswrapper[4610]: E1006 09:25:20.717286 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d\": container with ID starting with dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d not found: ID does not exist" containerID="dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.717369 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d"} err="failed to get container status \"dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d\": rpc error: code = NotFound desc = could not find container \"dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d\": container with ID starting with dd49e95071caf1e21546a18741b67b601ba8c47efa7625202786fc3d9104767d not found: ID does not exist" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.717418 4610 scope.go:117] "RemoveContainer" containerID="0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79" Oct 06 09:25:20 crc 
kubenswrapper[4610]: E1006 09:25:20.717852 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79\": container with ID starting with 0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79 not found: ID does not exist" containerID="0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.717888 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79"} err="failed to get container status \"0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79\": rpc error: code = NotFound desc = could not find container \"0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79\": container with ID starting with 0567a10b6d2f5aa6850531c26a39fc58ec7be0787cf0513f6951794148293f79 not found: ID does not exist" Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.926942 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxcwz"] Oct 06 09:25:20 crc kubenswrapper[4610]: I1006 09:25:20.936065 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vxcwz"] Oct 06 09:25:21 crc kubenswrapper[4610]: I1006 09:25:21.082630 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" path="/var/lib/kubelet/pods/65e01d06-ad18-4579-97ef-5a5c258c9c24/volumes" Oct 06 09:26:16 crc kubenswrapper[4610]: I1006 09:26:16.469134 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:26:16 crc kubenswrapper[4610]: I1006 09:26:16.469736 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.864536 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m8lzq"] Oct 06 09:26:38 crc kubenswrapper[4610]: E1006 09:26:38.865531 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="extract-utilities" Oct 06 09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.865553 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="extract-utilities" Oct 06 09:26:38 crc kubenswrapper[4610]: E1006 09:26:38.865607 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="extract-content" Oct 06 09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.865617 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="extract-content" Oct 06 09:26:38 crc kubenswrapper[4610]: E1006 09:26:38.865636 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="registry-server" Oct 06 
09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.865644 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="registry-server" Oct 06 09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.865843 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e01d06-ad18-4579-97ef-5a5c258c9c24" containerName="registry-server" Oct 06 09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.867509 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:38 crc kubenswrapper[4610]: I1006 09:26:38.886946 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8lzq"] Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.037604 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-catalog-content\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.037943 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48q5b\" (UniqueName: \"kubernetes.io/projected/809579d1-4b54-4412-ae89-5976295dfd01-kube-api-access-48q5b\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.038130 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-utilities\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.140256 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-utilities\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.140371 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-catalog-content\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.140397 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48q5b\" (UniqueName: \"kubernetes.io/projected/809579d1-4b54-4412-ae89-5976295dfd01-kube-api-access-48q5b\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.140909 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-utilities\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" 
Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.140962 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-catalog-content\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.158889 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48q5b\" (UniqueName: \"kubernetes.io/projected/809579d1-4b54-4412-ae89-5976295dfd01-kube-api-access-48q5b\") pod \"redhat-marketplace-m8lzq\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.201279 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:39 crc kubenswrapper[4610]: I1006 09:26:39.653758 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8lzq"] Oct 06 09:26:40 crc kubenswrapper[4610]: I1006 09:26:40.485298 4610 generic.go:334] "Generic (PLEG): container finished" podID="809579d1-4b54-4412-ae89-5976295dfd01" containerID="12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0" exitCode=0 Oct 06 09:26:40 crc kubenswrapper[4610]: I1006 09:26:40.485905 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerDied","Data":"12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0"} Oct 06 09:26:40 crc kubenswrapper[4610]: I1006 09:26:40.485939 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerStarted","Data":"00fb9904d1c62447ee1467a296ec93210fe674e4c010875d832a60bb9db1b9af"} Oct 06 09:26:41 crc kubenswrapper[4610]: I1006 09:26:41.496073 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerStarted","Data":"fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0"} Oct 06 09:26:42 crc kubenswrapper[4610]: I1006 09:26:42.509701 4610 generic.go:334] "Generic (PLEG): container finished" podID="809579d1-4b54-4412-ae89-5976295dfd01" containerID="fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0" exitCode=0 Oct 06 09:26:42 crc kubenswrapper[4610]: I1006 09:26:42.509825 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerDied","Data":"fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0"} Oct 06 09:26:43 crc kubenswrapper[4610]: I1006 09:26:43.520094 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerStarted","Data":"c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682"} Oct 06 09:26:43 crc kubenswrapper[4610]: I1006 09:26:43.542353 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m8lzq" podStartSLOduration=3.140080841 podStartE2EDuration="5.542333948s" 
podCreationTimestamp="2025-10-06 09:26:38 +0000 UTC" firstStartedPulling="2025-10-06 09:26:40.489588617 +0000 UTC m=+2732.204642005" lastFinishedPulling="2025-10-06 09:26:42.891841704 +0000 UTC m=+2734.606895112" observedRunningTime="2025-10-06 09:26:43.542300777 +0000 UTC m=+2735.257354185" watchObservedRunningTime="2025-10-06 09:26:43.542333948 +0000 UTC m=+2735.257387336" Oct 06 09:26:46 crc kubenswrapper[4610]: I1006 09:26:46.469842 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:26:46 crc kubenswrapper[4610]: I1006 09:26:46.470382 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:26:49 crc kubenswrapper[4610]: I1006 09:26:49.202796 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:49 crc kubenswrapper[4610]: I1006 09:26:49.203276 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:49 crc kubenswrapper[4610]: I1006 09:26:49.253898 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:49 crc kubenswrapper[4610]: I1006 09:26:49.631578 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:49 crc kubenswrapper[4610]: I1006 09:26:49.699023 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8lzq"] Oct 06 09:26:51 crc kubenswrapper[4610]: I1006 09:26:51.594635 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m8lzq" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="registry-server" containerID="cri-o://c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682" gracePeriod=2 Oct 06 09:26:51 crc kubenswrapper[4610]: E1006 09:26:51.817228 4610 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod809579d1_4b54_4412_ae89_5976295dfd01.slice/crio-conmon-c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod809579d1_4b54_4412_ae89_5976295dfd01.slice/crio-c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682.scope\": RecentStats: unable to find data in memory cache]" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.047437 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.202645 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-utilities\") pod \"809579d1-4b54-4412-ae89-5976295dfd01\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.202832 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-catalog-content\") pod \"809579d1-4b54-4412-ae89-5976295dfd01\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.202876 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48q5b\" (UniqueName: \"kubernetes.io/projected/809579d1-4b54-4412-ae89-5976295dfd01-kube-api-access-48q5b\") pod \"809579d1-4b54-4412-ae89-5976295dfd01\" (UID: \"809579d1-4b54-4412-ae89-5976295dfd01\") " Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.203484 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-utilities" (OuterVolumeSpecName: "utilities") pod "809579d1-4b54-4412-ae89-5976295dfd01" (UID: "809579d1-4b54-4412-ae89-5976295dfd01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.210738 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809579d1-4b54-4412-ae89-5976295dfd01-kube-api-access-48q5b" (OuterVolumeSpecName: "kube-api-access-48q5b") pod "809579d1-4b54-4412-ae89-5976295dfd01" (UID: "809579d1-4b54-4412-ae89-5976295dfd01"). InnerVolumeSpecName "kube-api-access-48q5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.219238 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "809579d1-4b54-4412-ae89-5976295dfd01" (UID: "809579d1-4b54-4412-ae89-5976295dfd01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.305442 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.305472 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48q5b\" (UniqueName: \"kubernetes.io/projected/809579d1-4b54-4412-ae89-5976295dfd01-kube-api-access-48q5b\") on node \"crc\" DevicePath \"\"" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.305484 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/809579d1-4b54-4412-ae89-5976295dfd01-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.603531 4610 generic.go:334] "Generic (PLEG): container finished" podID="809579d1-4b54-4412-ae89-5976295dfd01" containerID="c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682" exitCode=0 Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.603571 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8lzq" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.603580 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerDied","Data":"c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682"} Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.603607 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8lzq" event={"ID":"809579d1-4b54-4412-ae89-5976295dfd01","Type":"ContainerDied","Data":"00fb9904d1c62447ee1467a296ec93210fe674e4c010875d832a60bb9db1b9af"} Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.603625 4610 scope.go:117] "RemoveContainer" containerID="c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.630805 4610 scope.go:117] "RemoveContainer" containerID="fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.656008 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8lzq"] Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.665243 4610 scope.go:117] "RemoveContainer" containerID="12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.671082 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8lzq"] Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.696378 4610 scope.go:117] "RemoveContainer" containerID="c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682" Oct 06 09:26:52 crc kubenswrapper[4610]: E1006 09:26:52.697669 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682\": container with ID starting with c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682 not found: ID does not exist" containerID="c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.697717 4610 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682"} err="failed to get container status \"c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682\": rpc error: code = NotFound desc = could not find container \"c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682\": container with ID starting with c158b8595e3763d228e651f6689ba909c6ad0b12b5892d80dedeb95040ddd682 not found: ID does not exist" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.697744 4610 scope.go:117] "RemoveContainer" containerID="fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0" Oct 06 09:26:52 crc kubenswrapper[4610]: E1006 09:26:52.698278 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0\": container with ID starting with fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0 not found: ID does not exist" containerID="fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.698310 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0"} err="failed to get container status \"fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0\": rpc error: code = NotFound desc = could not find container \"fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0\": container with ID starting with fb53ad2882d416c7631ce3a786a55585c4b7531169d63905b798f47ebb0cb3b0 not found: ID does not exist" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.698331 4610 scope.go:117] "RemoveContainer" containerID="12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0" Oct 06 09:26:52 crc kubenswrapper[4610]: E1006 09:26:52.698576 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0\": container with ID starting with 12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0 not found: ID does not exist" containerID="12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0" Oct 06 09:26:52 crc kubenswrapper[4610]: I1006 09:26:52.698605 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0"} err="failed to get container status \"12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0\": rpc error: code = NotFound desc = could not find container \"12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0\": container with ID starting with 12127a7893411bc2f61b8714929ef478f47e759f0bcb4af23041d8a22c6d84c0 not found: ID does not exist" Oct 06 09:26:53 crc kubenswrapper[4610]: I1006 09:26:53.092637 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="809579d1-4b54-4412-ae89-5976295dfd01" path="/var/lib/kubelet/pods/809579d1-4b54-4412-ae89-5976295dfd01/volumes" Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.468607 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.469077 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.469114 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.469615 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6f96bf18662270aeb2cd015583e405a7ccb309c85ec64e719f99df2d9fc0c9a"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.469659 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://e6f96bf18662270aeb2cd015583e405a7ccb309c85ec64e719f99df2d9fc0c9a" gracePeriod=600 Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.862053 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="e6f96bf18662270aeb2cd015583e405a7ccb309c85ec64e719f99df2d9fc0c9a" exitCode=0 Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.862081 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"e6f96bf18662270aeb2cd015583e405a7ccb309c85ec64e719f99df2d9fc0c9a"} Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.862354 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d"} Oct 06 09:27:16 crc kubenswrapper[4610]: I1006 09:27:16.862381 4610 scope.go:117] "RemoveContainer" containerID="5a313681b8cc6ca90d81178312cbd85a96499ab5071a5f5eb4125f89bce1ab31" Oct 06 09:27:50 crc kubenswrapper[4610]: I1006 09:27:50.207297 4610 generic.go:334] "Generic (PLEG): container finished" podID="3ba105b9-3b48-4236-a86d-6fcded83393a" containerID="7726683d2f01a8db8fb69f93cee4e44001504c2ac4664a4a42279646ddf928cb" exitCode=0 Oct 06 09:27:50 crc kubenswrapper[4610]: I1006 09:27:50.207784 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" event={"ID":"3ba105b9-3b48-4236-a86d-6fcded83393a","Type":"ContainerDied","Data":"7726683d2f01a8db8fb69f93cee4e44001504c2ac4664a4a42279646ddf928cb"} Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.692857 4610 util.go:48] "No ready sandbox for pod can be found. 
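
"Killing container with a grace period ... gracePeriod=600" above is the standard two-phase stop: the runtime delivers SIGTERM (or the image's STOPSIGNAL), waits up to the grace period, and only then sends SIGKILL; the 600s value comes from the pod's terminationGracePeriodSeconds. A process-level sketch of that ordering, assuming a plain OS process rather than a CRI runtime (stopWithGrace is a hypothetical helper):

    package main

    import (
        "log"
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace sends SIGTERM first and SIGKILLs only if the process
    // is still alive when the grace period expires.
    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
        if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
            return err
        }
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case err := <-done:
            return err // exited in time; the log shows exitCode=0 for this path
        case <-time.After(grace):
            return cmd.Process.Kill() // hard stop once the grace period is spent
        }
    }

    func main() {
        cmd := exec.Command("sleep", "300")
        if err := cmd.Start(); err != nil {
            log.Fatal(err)
        }
        log.Println(stopWithGrace(cmd, 2*time.Second))
    }

Here the daemon exited promptly (exitCode=0), so the kill path was never needed and the replacement container ("ContainerStarted") appears in the same PLEG event batch.
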
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798169 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-1\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798208 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-0\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798275 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-0\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798337 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-extra-config-0\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798374 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-1\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798401 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqb5k\" (UniqueName: \"kubernetes.io/projected/3ba105b9-3b48-4236-a86d-6fcded83393a-kube-api-access-nqb5k\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798435 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-ssh-key\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798563 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-combined-ca-bundle\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.798616 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-inventory\") pod \"3ba105b9-3b48-4236-a86d-6fcded83393a\" (UID: \"3ba105b9-3b48-4236-a86d-6fcded83393a\") " Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.804208 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3ba105b9-3b48-4236-a86d-6fcded83393a-kube-api-access-nqb5k" (OuterVolumeSpecName: "kube-api-access-nqb5k") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "kube-api-access-nqb5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.809566 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.831275 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.836259 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.844288 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.846545 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-inventory" (OuterVolumeSpecName: "inventory") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.849220 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.863531 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.866456 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3ba105b9-3b48-4236-a86d-6fcded83393a" (UID: "3ba105b9-3b48-4236-a86d-6fcded83393a"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901160 4610 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901198 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901212 4610 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901224 4610 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901234 4610 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901248 4610 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901260 4610 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901269 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqb5k\" (UniqueName: \"kubernetes.io/projected/3ba105b9-3b48-4236-a86d-6fcded83393a-kube-api-access-nqb5k\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:51 crc kubenswrapper[4610]: I1006 09:27:51.901280 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ba105b9-3b48-4236-a86d-6fcded83393a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.233472 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" event={"ID":"3ba105b9-3b48-4236-a86d-6fcded83393a","Type":"ContainerDied","Data":"9297843248c008a7b5461f71198d1beb4ddc3494e4a5b20e57b55f285f93910d"} Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.233511 4610 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9297843248c008a7b5461f71198d1beb4ddc3494e4a5b20e57b55f285f93910d" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.233529 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jxfdh" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.400625 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd"] Oct 06 09:27:52 crc kubenswrapper[4610]: E1006 09:27:52.401021 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="registry-server" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.401054 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="registry-server" Oct 06 09:27:52 crc kubenswrapper[4610]: E1006 09:27:52.401073 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba105b9-3b48-4236-a86d-6fcded83393a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.401081 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba105b9-3b48-4236-a86d-6fcded83393a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 09:27:52 crc kubenswrapper[4610]: E1006 09:27:52.401112 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="extract-utilities" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.401123 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="extract-utilities" Oct 06 09:27:52 crc kubenswrapper[4610]: E1006 09:27:52.401153 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="extract-content" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.401160 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="extract-content" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.401362 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba105b9-3b48-4236-a86d-6fcded83393a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.401408 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="809579d1-4b54-4412-ae89-5976295dfd01" containerName="registry-server" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.402123 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.406679 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.406854 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.406975 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7f7g5" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.407148 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.410253 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.412298 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd"] Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.511409 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.511462 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.511483 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k574l\" (UniqueName: \"kubernetes.io/projected/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-kube-api-access-k574l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.511572 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.511601 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 
09:27:52.511616 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.511858 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.613516 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.613582 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.613786 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.613889 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.613951 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.613982 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k574l\" (UniqueName: \"kubernetes.io/projected/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-kube-api-access-k574l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.614267 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.620667 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.620695 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.623532 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.623978 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.624429 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.627951 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.650973 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k574l\" (UniqueName: \"kubernetes.io/projected/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-kube-api-access-k574l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd\" (UID: 
\"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:52 crc kubenswrapper[4610]: I1006 09:27:52.722468 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:27:53 crc kubenswrapper[4610]: I1006 09:27:53.241363 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd"] Oct 06 09:27:54 crc kubenswrapper[4610]: I1006 09:27:54.254287 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" event={"ID":"a11ef1e8-ba4f-4b82-adad-cbe054665d4c","Type":"ContainerStarted","Data":"5fc0aa6f76b635f5a943e710cc308a024b957433aee7e4b65c401d3db6345cc4"} Oct 06 09:27:54 crc kubenswrapper[4610]: I1006 09:27:54.254626 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" event={"ID":"a11ef1e8-ba4f-4b82-adad-cbe054665d4c","Type":"ContainerStarted","Data":"97d8ba01003b2b297710e7a33478fd28687802b38ad101980d38d0776fc9e89c"} Oct 06 09:27:54 crc kubenswrapper[4610]: I1006 09:27:54.287188 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" podStartSLOduration=2.114158537 podStartE2EDuration="2.28716526s" podCreationTimestamp="2025-10-06 09:27:52 +0000 UTC" firstStartedPulling="2025-10-06 09:27:53.241179223 +0000 UTC m=+2804.956232631" lastFinishedPulling="2025-10-06 09:27:53.414185966 +0000 UTC m=+2805.129239354" observedRunningTime="2025-10-06 09:27:54.283856486 +0000 UTC m=+2805.998909874" watchObservedRunningTime="2025-10-06 09:27:54.28716526 +0000 UTC m=+2806.002218668" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.026791 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g92rz"] Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.029760 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.047141 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g92rz"] Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.090081 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-utilities\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.090218 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-catalog-content\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.090248 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2w7\" (UniqueName: \"kubernetes.io/projected/69cb6772-1eb6-41a6-be06-a43afe52f96a-kube-api-access-7b2w7\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.192259 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-catalog-content\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.192347 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2w7\" (UniqueName: \"kubernetes.io/projected/69cb6772-1eb6-41a6-be06-a43afe52f96a-kube-api-access-7b2w7\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.192459 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-utilities\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.192846 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-catalog-content\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.192960 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-utilities\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.216425 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7b2w7\" (UniqueName: \"kubernetes.io/projected/69cb6772-1eb6-41a6-be06-a43afe52f96a-kube-api-access-7b2w7\") pod \"community-operators-g92rz\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.352231 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:06 crc kubenswrapper[4610]: I1006 09:28:06.941439 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g92rz"] Oct 06 09:28:07 crc kubenswrapper[4610]: I1006 09:28:07.362436 4610 generic.go:334] "Generic (PLEG): container finished" podID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerID="2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a" exitCode=0 Oct 06 09:28:07 crc kubenswrapper[4610]: I1006 09:28:07.362512 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerDied","Data":"2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a"} Oct 06 09:28:07 crc kubenswrapper[4610]: I1006 09:28:07.362713 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerStarted","Data":"7071babf940ac660825e494d2871174757b01fc86e54e87762951e626782abdc"} Oct 06 09:28:08 crc kubenswrapper[4610]: I1006 09:28:08.374870 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerStarted","Data":"bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba"} Oct 06 09:28:09 crc kubenswrapper[4610]: I1006 09:28:09.385231 4610 generic.go:334] "Generic (PLEG): container finished" podID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerID="bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba" exitCode=0 Oct 06 09:28:09 crc kubenswrapper[4610]: I1006 09:28:09.385292 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerDied","Data":"bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba"} Oct 06 09:28:10 crc kubenswrapper[4610]: I1006 09:28:10.395459 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerStarted","Data":"8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3"} Oct 06 09:28:10 crc kubenswrapper[4610]: I1006 09:28:10.427304 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g92rz" podStartSLOduration=1.7654414950000001 podStartE2EDuration="4.427282538s" podCreationTimestamp="2025-10-06 09:28:06 +0000 UTC" firstStartedPulling="2025-10-06 09:28:07.365000677 +0000 UTC m=+2819.080054065" lastFinishedPulling="2025-10-06 09:28:10.02684171 +0000 UTC m=+2821.741895108" observedRunningTime="2025-10-06 09:28:10.42380213 +0000 UTC m=+2822.138855518" watchObservedRunningTime="2025-10-06 09:28:10.427282538 +0000 UTC m=+2822.142335926" Oct 06 09:28:16 crc kubenswrapper[4610]: I1006 09:28:16.352590 4610 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:16 crc kubenswrapper[4610]: I1006 09:28:16.353263 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:16 crc kubenswrapper[4610]: I1006 09:28:16.402988 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:16 crc kubenswrapper[4610]: I1006 09:28:16.494872 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:16 crc kubenswrapper[4610]: I1006 09:28:16.643064 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g92rz"] Oct 06 09:28:18 crc kubenswrapper[4610]: I1006 09:28:18.468647 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g92rz" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="registry-server" containerID="cri-o://8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3" gracePeriod=2 Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.465583 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.482439 4610 generic.go:334] "Generic (PLEG): container finished" podID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerID="8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3" exitCode=0 Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.482485 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerDied","Data":"8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3"} Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.482511 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g92rz" event={"ID":"69cb6772-1eb6-41a6-be06-a43afe52f96a","Type":"ContainerDied","Data":"7071babf940ac660825e494d2871174757b01fc86e54e87762951e626782abdc"} Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.482524 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g92rz" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.482566 4610 scope.go:117] "RemoveContainer" containerID="8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.518805 4610 scope.go:117] "RemoveContainer" containerID="bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.553611 4610 scope.go:117] "RemoveContainer" containerID="2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.556494 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-utilities\") pod \"69cb6772-1eb6-41a6-be06-a43afe52f96a\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.556590 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b2w7\" (UniqueName: \"kubernetes.io/projected/69cb6772-1eb6-41a6-be06-a43afe52f96a-kube-api-access-7b2w7\") pod \"69cb6772-1eb6-41a6-be06-a43afe52f96a\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.556901 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-catalog-content\") pod \"69cb6772-1eb6-41a6-be06-a43afe52f96a\" (UID: \"69cb6772-1eb6-41a6-be06-a43afe52f96a\") " Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.557968 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-utilities" (OuterVolumeSpecName: "utilities") pod "69cb6772-1eb6-41a6-be06-a43afe52f96a" (UID: "69cb6772-1eb6-41a6-be06-a43afe52f96a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.564395 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69cb6772-1eb6-41a6-be06-a43afe52f96a-kube-api-access-7b2w7" (OuterVolumeSpecName: "kube-api-access-7b2w7") pod "69cb6772-1eb6-41a6-be06-a43afe52f96a" (UID: "69cb6772-1eb6-41a6-be06-a43afe52f96a"). InnerVolumeSpecName "kube-api-access-7b2w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.602358 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69cb6772-1eb6-41a6-be06-a43afe52f96a" (UID: "69cb6772-1eb6-41a6-be06-a43afe52f96a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.663972 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.664022 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cb6772-1eb6-41a6-be06-a43afe52f96a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.664036 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b2w7\" (UniqueName: \"kubernetes.io/projected/69cb6772-1eb6-41a6-be06-a43afe52f96a-kube-api-access-7b2w7\") on node \"crc\" DevicePath \"\"" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.687732 4610 scope.go:117] "RemoveContainer" containerID="8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3" Oct 06 09:28:19 crc kubenswrapper[4610]: E1006 09:28:19.688404 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3\": container with ID starting with 8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3 not found: ID does not exist" containerID="8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.688439 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3"} err="failed to get container status \"8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3\": rpc error: code = NotFound desc = could not find container \"8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3\": container with ID starting with 8958efd61c9164c4920ca0ef68f31d0ddc6deb988692c98216c9bac38de06bc3 not found: ID does not exist" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.688459 4610 scope.go:117] "RemoveContainer" containerID="bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba" Oct 06 09:28:19 crc kubenswrapper[4610]: E1006 09:28:19.688977 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba\": container with ID starting with bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba not found: ID does not exist" containerID="bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.689000 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba"} err="failed to get container status \"bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba\": rpc error: code = NotFound desc = could not find container \"bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba\": container with ID starting with bbb5a1426b31aa47670d16ce7f951a5679fe31c07e2865bea5a4f850950fc9ba not found: ID does not exist" Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.689012 4610 scope.go:117] "RemoveContainer" containerID="2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a" Oct 06 09:28:19 crc 
kubenswrapper[4610]: E1006 09:28:19.689843 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a\": container with ID starting with 2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a not found: ID does not exist" containerID="2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a"
Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.689920 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a"} err="failed to get container status \"2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a\": rpc error: code = NotFound desc = could not find container \"2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a\": container with ID starting with 2940671a5becffd1a12ceaadcf9b1779562ec33c41b43de5843e03bcb6cc0a0a not found: ID does not exist"
Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.818497 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g92rz"]
Oct 06 09:28:19 crc kubenswrapper[4610]: I1006 09:28:19.828453 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g92rz"]
Oct 06 09:28:21 crc kubenswrapper[4610]: I1006 09:28:21.083158 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" path="/var/lib/kubelet/pods/69cb6772-1eb6-41a6-be06-a43afe52f96a/volumes"
Oct 06 09:29:16 crc kubenswrapper[4610]: I1006 09:29:16.468676 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:29:16 crc kubenswrapper[4610]: I1006 09:29:16.469165 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:29:46 crc kubenswrapper[4610]: I1006 09:29:46.468989 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:29:46 crc kubenswrapper[4610]: I1006 09:29:46.469673 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.206649 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"]
Oct 06 09:30:00 crc kubenswrapper[4610]: E1006 09:30:00.208813 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="extract-utilities"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.208889 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="extract-utilities"
Oct 06 09:30:00 crc kubenswrapper[4610]: E1006 09:30:00.208956 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="registry-server"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.209009 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="registry-server"
Oct 06 09:30:00 crc kubenswrapper[4610]: E1006 09:30:00.209099 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="extract-content"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.209157 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="extract-content"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.209425 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cb6772-1eb6-41a6-be06-a43afe52f96a" containerName="registry-server"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.210478 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.215931 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.216330 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.242315 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"]
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.341164 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8plq\" (UniqueName: \"kubernetes.io/projected/c99ad092-7458-43c4-8848-19dd87947fef-kube-api-access-r8plq\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.341291 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c99ad092-7458-43c4-8848-19dd87947fef-config-volume\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.341449 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c99ad092-7458-43c4-8848-19dd87947fef-secret-volume\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.442534 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c99ad092-7458-43c4-8848-19dd87947fef-config-volume\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.442641 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c99ad092-7458-43c4-8848-19dd87947fef-secret-volume\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.442721 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8plq\" (UniqueName: \"kubernetes.io/projected/c99ad092-7458-43c4-8848-19dd87947fef-kube-api-access-r8plq\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.444354 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c99ad092-7458-43c4-8848-19dd87947fef-config-volume\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.474833 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c99ad092-7458-43c4-8848-19dd87947fef-secret-volume\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.480227 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8plq\" (UniqueName: \"kubernetes.io/projected/c99ad092-7458-43c4-8848-19dd87947fef-kube-api-access-r8plq\") pod \"collect-profiles-29329050-drq56\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.540496 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Oct 06 09:30:00 crc kubenswrapper[4610]: I1006 09:30:00.991374 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"]
Oct 06 09:30:01 crc kubenswrapper[4610]: I1006 09:30:01.493773 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" event={"ID":"c99ad092-7458-43c4-8848-19dd87947fef","Type":"ContainerStarted","Data":"11874c4fdd17966fee41a5826f0add624f484ea6d944074d69f81593e0209261"}
Oct 06 09:30:01 crc kubenswrapper[4610]: I1006 09:30:01.494132 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" event={"ID":"c99ad092-7458-43c4-8848-19dd87947fef","Type":"ContainerStarted","Data":"9a1adad72f7dbf93c6ec052b9fe0f502ee0601a86b0e70a53f89f357f910d4b6"}
Oct 06 09:30:01 crc kubenswrapper[4610]: I1006 09:30:01.523219 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" podStartSLOduration=1.5232011559999998 podStartE2EDuration="1.523201156s" podCreationTimestamp="2025-10-06 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 09:30:01.514685121 +0000 UTC m=+2933.229738509" watchObservedRunningTime="2025-10-06 09:30:01.523201156 +0000 UTC m=+2933.238254554"
Oct 06 09:30:02 crc kubenswrapper[4610]: I1006 09:30:02.502800 4610 generic.go:334] "Generic (PLEG): container finished" podID="c99ad092-7458-43c4-8848-19dd87947fef" containerID="11874c4fdd17966fee41a5826f0add624f484ea6d944074d69f81593e0209261" exitCode=0
Oct 06 09:30:02 crc kubenswrapper[4610]: I1006 09:30:02.503100 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" event={"ID":"c99ad092-7458-43c4-8848-19dd87947fef","Type":"ContainerDied","Data":"11874c4fdd17966fee41a5826f0add624f484ea6d944074d69f81593e0209261"}
Oct 06 09:30:03 crc kubenswrapper[4610]: I1006 09:30:03.850083 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.015886 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8plq\" (UniqueName: \"kubernetes.io/projected/c99ad092-7458-43c4-8848-19dd87947fef-kube-api-access-r8plq\") pod \"c99ad092-7458-43c4-8848-19dd87947fef\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.015948 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c99ad092-7458-43c4-8848-19dd87947fef-secret-volume\") pod \"c99ad092-7458-43c4-8848-19dd87947fef\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.016041 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c99ad092-7458-43c4-8848-19dd87947fef-config-volume\") pod \"c99ad092-7458-43c4-8848-19dd87947fef\" (UID: \"c99ad092-7458-43c4-8848-19dd87947fef\") " Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.016989 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c99ad092-7458-43c4-8848-19dd87947fef-config-volume" (OuterVolumeSpecName: "config-volume") pod "c99ad092-7458-43c4-8848-19dd87947fef" (UID: "c99ad092-7458-43c4-8848-19dd87947fef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.027306 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99ad092-7458-43c4-8848-19dd87947fef-kube-api-access-r8plq" (OuterVolumeSpecName: "kube-api-access-r8plq") pod "c99ad092-7458-43c4-8848-19dd87947fef" (UID: "c99ad092-7458-43c4-8848-19dd87947fef"). InnerVolumeSpecName "kube-api-access-r8plq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.036787 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99ad092-7458-43c4-8848-19dd87947fef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c99ad092-7458-43c4-8848-19dd87947fef" (UID: "c99ad092-7458-43c4-8848-19dd87947fef"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.118483 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8plq\" (UniqueName: \"kubernetes.io/projected/c99ad092-7458-43c4-8848-19dd87947fef-kube-api-access-r8plq\") on node \"crc\" DevicePath \"\"" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.118511 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c99ad092-7458-43c4-8848-19dd87947fef-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.118523 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c99ad092-7458-43c4-8848-19dd87947fef-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.522442 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" event={"ID":"c99ad092-7458-43c4-8848-19dd87947fef","Type":"ContainerDied","Data":"9a1adad72f7dbf93c6ec052b9fe0f502ee0601a86b0e70a53f89f357f910d4b6"} Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.522944 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1adad72f7dbf93c6ec052b9fe0f502ee0601a86b0e70a53f89f357f910d4b6" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.522507 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56" Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.580136 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4"] Oct 06 09:30:04 crc kubenswrapper[4610]: I1006 09:30:04.592575 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-ccvc4"] Oct 06 09:30:05 crc kubenswrapper[4610]: I1006 09:30:05.087926 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000cae55-4bf2-468e-8cb6-f056257b1f2e" path="/var/lib/kubelet/pods/000cae55-4bf2-468e-8cb6-f056257b1f2e/volumes" Oct 06 09:30:15 crc kubenswrapper[4610]: I1006 09:30:15.159741 4610 scope.go:117] "RemoveContainer" containerID="2a090a972808e399c9920e3f5be13003a440f02998819283ebe97ac6b56a90e7" Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.469618 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.470304 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.470378 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.471276 4610 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.471347 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" gracePeriod=600 Oct 06 09:30:16 crc kubenswrapper[4610]: E1006 09:30:16.601150 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.637513 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" exitCode=0 Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.637578 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d"} Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.637614 4610 scope.go:117] "RemoveContainer" containerID="e6f96bf18662270aeb2cd015583e405a7ccb309c85ec64e719f99df2d9fc0c9a" Oct 06 09:30:16 crc kubenswrapper[4610]: I1006 09:30:16.638320 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:30:16 crc kubenswrapper[4610]: E1006 09:30:16.638733 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:30:28 crc kubenswrapper[4610]: I1006 09:30:28.070796 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:30:28 crc kubenswrapper[4610]: E1006 09:30:28.071675 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:30:43 crc kubenswrapper[4610]: I1006 09:30:43.070595 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:30:43 crc kubenswrapper[4610]: E1006 09:30:43.071460 4610 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:30:56 crc kubenswrapper[4610]: I1006 09:30:56.071723 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:30:56 crc kubenswrapper[4610]: E1006 09:30:56.072878 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:31:11 crc kubenswrapper[4610]: I1006 09:31:11.071326 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:31:11 crc kubenswrapper[4610]: E1006 09:31:11.072110 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:31:26 crc kubenswrapper[4610]: I1006 09:31:26.070131 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:31:26 crc kubenswrapper[4610]: E1006 09:31:26.070841 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:31:26 crc kubenswrapper[4610]: I1006 09:31:26.309010 4610 generic.go:334] "Generic (PLEG): container finished" podID="a11ef1e8-ba4f-4b82-adad-cbe054665d4c" containerID="5fc0aa6f76b635f5a943e710cc308a024b957433aee7e4b65c401d3db6345cc4" exitCode=0 Oct 06 09:31:26 crc kubenswrapper[4610]: I1006 09:31:26.309092 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" event={"ID":"a11ef1e8-ba4f-4b82-adad-cbe054665d4c","Type":"ContainerDied","Data":"5fc0aa6f76b635f5a943e710cc308a024b957433aee7e4b65c401d3db6345cc4"} Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.719153 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.837811 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-telemetry-combined-ca-bundle\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.837998 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-inventory\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.838036 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-0\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.838131 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-1\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.838155 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k574l\" (UniqueName: \"kubernetes.io/projected/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-kube-api-access-k574l\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.838185 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ssh-key\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.838222 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-2\") pod \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\" (UID: \"a11ef1e8-ba4f-4b82-adad-cbe054665d4c\") " Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.855670 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-kube-api-access-k574l" (OuterVolumeSpecName: "kube-api-access-k574l") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). InnerVolumeSpecName "kube-api-access-k574l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.855689 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.869620 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.872332 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.874689 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-inventory" (OuterVolumeSpecName: "inventory") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.884221 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.896488 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a11ef1e8-ba4f-4b82-adad-cbe054665d4c" (UID: "a11ef1e8-ba4f-4b82-adad-cbe054665d4c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.941953 4610 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.942203 4610 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.942344 4610 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.942470 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k574l\" (UniqueName: \"kubernetes.io/projected/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-kube-api-access-k574l\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.942627 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.942731 4610 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:27 crc kubenswrapper[4610]: I1006 09:31:27.942862 4610 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11ef1e8-ba4f-4b82-adad-cbe054665d4c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:28 crc kubenswrapper[4610]: I1006 09:31:28.332659 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" event={"ID":"a11ef1e8-ba4f-4b82-adad-cbe054665d4c","Type":"ContainerDied","Data":"97d8ba01003b2b297710e7a33478fd28687802b38ad101980d38d0776fc9e89c"} Oct 06 09:31:28 crc kubenswrapper[4610]: I1006 09:31:28.332703 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97d8ba01003b2b297710e7a33478fd28687802b38ad101980d38d0776fc9e89c" Oct 06 09:31:28 crc kubenswrapper[4610]: I1006 09:31:28.332768 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd" Oct 06 09:31:38 crc kubenswrapper[4610]: I1006 09:31:38.070322 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:31:38 crc kubenswrapper[4610]: E1006 09:31:38.072349 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.228779 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7b957"] Oct 06 09:31:42 crc kubenswrapper[4610]: E1006 09:31:42.230102 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11ef1e8-ba4f-4b82-adad-cbe054665d4c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.230131 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11ef1e8-ba4f-4b82-adad-cbe054665d4c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 09:31:42 crc kubenswrapper[4610]: E1006 09:31:42.230153 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99ad092-7458-43c4-8848-19dd87947fef" containerName="collect-profiles" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.230167 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99ad092-7458-43c4-8848-19dd87947fef" containerName="collect-profiles" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.230520 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11ef1e8-ba4f-4b82-adad-cbe054665d4c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.230566 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99ad092-7458-43c4-8848-19dd87947fef" containerName="collect-profiles" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.233210 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.260800 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-utilities\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.260876 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-catalog-content\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.261102 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpjnv\" (UniqueName: \"kubernetes.io/projected/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-kube-api-access-gpjnv\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.274416 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7b957"] Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.362891 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-utilities\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.362955 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-catalog-content\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.363113 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpjnv\" (UniqueName: \"kubernetes.io/projected/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-kube-api-access-gpjnv\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.364118 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-utilities\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.364461 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-catalog-content\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.382912 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gpjnv\" (UniqueName: \"kubernetes.io/projected/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-kube-api-access-gpjnv\") pod \"certified-operators-7b957\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:42 crc kubenswrapper[4610]: I1006 09:31:42.570431 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:43 crc kubenswrapper[4610]: I1006 09:31:43.100763 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7b957"] Oct 06 09:31:43 crc kubenswrapper[4610]: I1006 09:31:43.466013 4610 generic.go:334] "Generic (PLEG): container finished" podID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerID="7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6" exitCode=0 Oct 06 09:31:43 crc kubenswrapper[4610]: I1006 09:31:43.466086 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b957" event={"ID":"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3","Type":"ContainerDied","Data":"7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6"} Oct 06 09:31:43 crc kubenswrapper[4610]: I1006 09:31:43.466123 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b957" event={"ID":"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3","Type":"ContainerStarted","Data":"5a2a781dafe6ba910ea97a3a15936470db8401e70624e54a27dc6bc268510daf"} Oct 06 09:31:43 crc kubenswrapper[4610]: I1006 09:31:43.469139 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:31:44 crc kubenswrapper[4610]: E1006 09:31:44.811554 4610 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.21:58916->38.129.56.21:42153: write tcp 38.129.56.21:58916->38.129.56.21:42153: write: broken pipe Oct 06 09:31:44 crc kubenswrapper[4610]: E1006 09:31:44.811564 4610 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.21:58916->38.129.56.21:42153: read tcp 38.129.56.21:58916->38.129.56.21:42153: read: connection reset by peer Oct 06 09:31:45 crc kubenswrapper[4610]: I1006 09:31:45.492220 4610 generic.go:334] "Generic (PLEG): container finished" podID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerID="ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0" exitCode=0 Oct 06 09:31:45 crc kubenswrapper[4610]: I1006 09:31:45.492400 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b957" event={"ID":"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3","Type":"ContainerDied","Data":"ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0"} Oct 06 09:31:46 crc kubenswrapper[4610]: I1006 09:31:46.503333 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b957" event={"ID":"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3","Type":"ContainerStarted","Data":"6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a"} Oct 06 09:31:46 crc kubenswrapper[4610]: I1006 09:31:46.530971 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7b957" podStartSLOduration=2.023702279 podStartE2EDuration="4.530956377s" podCreationTimestamp="2025-10-06 09:31:42 +0000 UTC" firstStartedPulling="2025-10-06 09:31:43.468629696 +0000 UTC m=+3035.183683124" 
lastFinishedPulling="2025-10-06 09:31:45.975883794 +0000 UTC m=+3037.690937222" observedRunningTime="2025-10-06 09:31:46.529843408 +0000 UTC m=+3038.244896816" watchObservedRunningTime="2025-10-06 09:31:46.530956377 +0000 UTC m=+3038.246009765" Oct 06 09:31:52 crc kubenswrapper[4610]: I1006 09:31:52.071562 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:31:52 crc kubenswrapper[4610]: E1006 09:31:52.072868 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:31:52 crc kubenswrapper[4610]: I1006 09:31:52.573117 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:52 crc kubenswrapper[4610]: I1006 09:31:52.573169 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:52 crc kubenswrapper[4610]: I1006 09:31:52.639870 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:53 crc kubenswrapper[4610]: I1006 09:31:53.641512 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:53 crc kubenswrapper[4610]: I1006 09:31:53.688947 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7b957"] Oct 06 09:31:55 crc kubenswrapper[4610]: I1006 09:31:55.605340 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7b957" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="registry-server" containerID="cri-o://6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a" gracePeriod=2 Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.063324 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.242107 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-catalog-content\") pod \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.242288 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-utilities\") pod \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.242375 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpjnv\" (UniqueName: \"kubernetes.io/projected/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-kube-api-access-gpjnv\") pod \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\" (UID: \"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3\") " Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.244191 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-utilities" (OuterVolumeSpecName: "utilities") pod "55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" (UID: "55e6aa9a-687f-4422-8dd9-3f1c0f2269f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.248550 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-kube-api-access-gpjnv" (OuterVolumeSpecName: "kube-api-access-gpjnv") pod "55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" (UID: "55e6aa9a-687f-4422-8dd9-3f1c0f2269f3"). InnerVolumeSpecName "kube-api-access-gpjnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.291572 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" (UID: "55e6aa9a-687f-4422-8dd9-3f1c0f2269f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.345033 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.345399 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.345415 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpjnv\" (UniqueName: \"kubernetes.io/projected/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3-kube-api-access-gpjnv\") on node \"crc\" DevicePath \"\"" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.621329 4610 generic.go:334] "Generic (PLEG): container finished" podID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerID="6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a" exitCode=0 Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.621380 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b957" event={"ID":"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3","Type":"ContainerDied","Data":"6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a"} Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.621414 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7b957" event={"ID":"55e6aa9a-687f-4422-8dd9-3f1c0f2269f3","Type":"ContainerDied","Data":"5a2a781dafe6ba910ea97a3a15936470db8401e70624e54a27dc6bc268510daf"} Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.621433 4610 scope.go:117] "RemoveContainer" containerID="6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.621582 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7b957" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.646176 4610 scope.go:117] "RemoveContainer" containerID="ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.673602 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7b957"] Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.699514 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7b957"] Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.704468 4610 scope.go:117] "RemoveContainer" containerID="7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.728548 4610 scope.go:117] "RemoveContainer" containerID="6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a" Oct 06 09:31:56 crc kubenswrapper[4610]: E1006 09:31:56.728974 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a\": container with ID starting with 6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a not found: ID does not exist" containerID="6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.729015 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a"} err="failed to get container status \"6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a\": rpc error: code = NotFound desc = could not find container \"6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a\": container with ID starting with 6d45e3d1c5a552f6d83b9d05d32dbd1138f858b053aead2ef506c95318f0c81a not found: ID does not exist" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.729037 4610 scope.go:117] "RemoveContainer" containerID="ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0" Oct 06 09:31:56 crc kubenswrapper[4610]: E1006 09:31:56.729326 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0\": container with ID starting with ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0 not found: ID does not exist" containerID="ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.729346 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0"} err="failed to get container status \"ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0\": rpc error: code = NotFound desc = could not find container \"ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0\": container with ID starting with ba679e0ab69e80a531988675f751cd5d7315a716d2757cf9faf4ffcb895f6ba0 not found: ID does not exist" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.729360 4610 scope.go:117] "RemoveContainer" containerID="7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6" Oct 06 09:31:56 crc kubenswrapper[4610]: E1006 09:31:56.729543 4610 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6\": container with ID starting with 7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6 not found: ID does not exist" containerID="7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6" Oct 06 09:31:56 crc kubenswrapper[4610]: I1006 09:31:56.729563 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6"} err="failed to get container status \"7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6\": rpc error: code = NotFound desc = could not find container \"7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6\": container with ID starting with 7b023939a52bf2c0bd2bdb44b2b6e43213186bec689387f4137bb66a460039a6 not found: ID does not exist" Oct 06 09:31:57 crc kubenswrapper[4610]: I1006 09:31:57.085456 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" path="/var/lib/kubelet/pods/55e6aa9a-687f-4422-8dd9-3f1c0f2269f3/volumes" Oct 06 09:32:05 crc kubenswrapper[4610]: I1006 09:32:05.070525 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:32:05 crc kubenswrapper[4610]: E1006 09:32:05.071465 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.649747 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 09:32:17 crc kubenswrapper[4610]: E1006 09:32:17.651979 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="registry-server" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.652121 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="registry-server" Oct 06 09:32:17 crc kubenswrapper[4610]: E1006 09:32:17.652256 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="extract-content" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.652348 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="extract-content" Oct 06 09:32:17 crc kubenswrapper[4610]: E1006 09:32:17.652433 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="extract-utilities" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.652515 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="extract-utilities" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.652826 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e6aa9a-687f-4422-8dd9-3f1c0f2269f3" containerName="registry-server" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.653694 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.657474 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bpqlc" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.657674 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.659347 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.660566 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.678376 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.751827 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-config-data\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.751954 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.752123 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.854027 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-config-data\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.854375 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.854556 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.854672 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.854844 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.855298 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.856220 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-config-data\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.856450 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.856522 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.856642 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.856774 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfx6s\" (UniqueName: \"kubernetes.io/projected/6effef24-402a-46e6-a15a-02815ef810ae-kube-api-access-jfx6s\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.865772 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.958533 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.958599 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.958657 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfx6s\" (UniqueName: \"kubernetes.io/projected/6effef24-402a-46e6-a15a-02815ef810ae-kube-api-access-jfx6s\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.958785 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.958875 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.958935 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.959415 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.959459 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.959674 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.972729 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 
09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.973205 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:17 crc kubenswrapper[4610]: I1006 09:32:17.977023 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfx6s\" (UniqueName: \"kubernetes.io/projected/6effef24-402a-46e6-a15a-02815ef810ae-kube-api-access-jfx6s\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:18 crc kubenswrapper[4610]: I1006 09:32:18.005604 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " pod="openstack/tempest-tests-tempest" Oct 06 09:32:18 crc kubenswrapper[4610]: I1006 09:32:18.277813 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 09:32:18 crc kubenswrapper[4610]: I1006 09:32:18.800709 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 09:32:18 crc kubenswrapper[4610]: W1006 09:32:18.801564 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6effef24_402a_46e6_a15a_02815ef810ae.slice/crio-d53b6b0bed7d42016506090d754a295c0566054219284ea11272542ab99fb26e WatchSource:0}: Error finding container d53b6b0bed7d42016506090d754a295c0566054219284ea11272542ab99fb26e: Status 404 returned error can't find the container with id d53b6b0bed7d42016506090d754a295c0566054219284ea11272542ab99fb26e Oct 06 09:32:18 crc kubenswrapper[4610]: I1006 09:32:18.873382 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6effef24-402a-46e6-a15a-02815ef810ae","Type":"ContainerStarted","Data":"d53b6b0bed7d42016506090d754a295c0566054219284ea11272542ab99fb26e"} Oct 06 09:32:19 crc kubenswrapper[4610]: I1006 09:32:19.077669 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:32:19 crc kubenswrapper[4610]: E1006 09:32:19.077936 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:32:30 crc kubenswrapper[4610]: I1006 09:32:30.070882 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:32:30 crc kubenswrapper[4610]: E1006 09:32:30.071577 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" 
podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:32:41 crc kubenswrapper[4610]: I1006 09:32:41.070634 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:32:41 crc kubenswrapper[4610]: E1006 09:32:41.071420 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:32:49 crc kubenswrapper[4610]: E1006 09:32:49.640961 4610 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 06 09:32:49 crc kubenswrapper[4610]: E1006 09:32:49.644806 4610 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfx6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProf
ile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(6effef24-402a-46e6-a15a-02815ef810ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 09:32:49 crc kubenswrapper[4610]: E1006 09:32:49.645992 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="6effef24-402a-46e6-a15a-02815ef810ae" Oct 06 09:32:50 crc kubenswrapper[4610]: E1006 09:32:50.190447 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="6effef24-402a-46e6-a15a-02815ef810ae" Oct 06 09:32:54 crc kubenswrapper[4610]: I1006 09:32:54.071281 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:32:54 crc kubenswrapper[4610]: E1006 09:32:54.071995 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:33:05 crc kubenswrapper[4610]: I1006 09:33:05.311254 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6effef24-402a-46e6-a15a-02815ef810ae","Type":"ContainerStarted","Data":"3857536c327114a3ff5db3acd0cbe8b95622569cc2358a5018afbc272436d061"} Oct 06 09:33:05 crc kubenswrapper[4610]: I1006 09:33:05.342591 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.596107945 podStartE2EDuration="49.342575449s" podCreationTimestamp="2025-10-06 09:32:16 +0000 UTC" firstStartedPulling="2025-10-06 09:32:18.805637801 +0000 UTC m=+3070.520691189" lastFinishedPulling="2025-10-06 09:33:03.552105295 +0000 UTC m=+3115.267158693" observedRunningTime="2025-10-06 09:33:05.339384105 +0000 UTC m=+3117.054437513" watchObservedRunningTime="2025-10-06 09:33:05.342575449 +0000 UTC m=+3117.057628837" Oct 06 09:33:08 crc kubenswrapper[4610]: I1006 09:33:08.070054 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:33:08 crc kubenswrapper[4610]: E1006 09:33:08.070789 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:33:19 crc kubenswrapper[4610]: I1006 09:33:19.082276 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:33:19 crc kubenswrapper[4610]: E1006 09:33:19.083011 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:33:30 crc kubenswrapper[4610]: I1006 09:33:30.070137 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:33:30 crc kubenswrapper[4610]: E1006 09:33:30.072263 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:33:43 crc kubenswrapper[4610]: I1006 09:33:43.074704 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:33:43 crc kubenswrapper[4610]: E1006 09:33:43.075626 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:33:54 crc kubenswrapper[4610]: I1006 09:33:54.070986 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:33:54 crc kubenswrapper[4610]: E1006 09:33:54.071669 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:34:08 crc kubenswrapper[4610]: I1006 09:34:08.070545 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:34:08 crc kubenswrapper[4610]: E1006 09:34:08.071240 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:34:22 crc kubenswrapper[4610]: I1006 09:34:22.074695 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:34:22 crc kubenswrapper[4610]: E1006 09:34:22.075814 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:34:35 crc kubenswrapper[4610]: I1006 09:34:35.070905 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:34:35 crc kubenswrapper[4610]: E1006 09:34:35.071621 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:34:48 crc kubenswrapper[4610]: I1006 09:34:48.070251 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:34:48 crc kubenswrapper[4610]: E1006 09:34:48.071248 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:35:01 crc kubenswrapper[4610]: I1006 09:35:01.071367 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:35:01 crc kubenswrapper[4610]: E1006 09:35:01.072265 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:35:14 crc kubenswrapper[4610]: I1006 09:35:14.070620 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:35:14 crc kubenswrapper[4610]: E1006 09:35:14.071563 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:35:29 crc kubenswrapper[4610]: I1006 09:35:29.077155 4610 
scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:35:29 crc kubenswrapper[4610]: I1006 09:35:29.632126 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"bc5e9fba468745d692775ab2b2689bceb84e65ce71a3475f4f0f38b55f42e31d"} Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.382662 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fw8t4"] Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.385616 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.404136 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw8t4"] Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.501314 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt6rw\" (UniqueName: \"kubernetes.io/projected/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-kube-api-access-jt6rw\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.501377 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-utilities\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.501468 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-catalog-content\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.604532 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt6rw\" (UniqueName: \"kubernetes.io/projected/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-kube-api-access-jt6rw\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.604975 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-utilities\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.605461 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-catalog-content\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.606305 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-catalog-content\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.606622 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-utilities\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.628927 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt6rw\" (UniqueName: \"kubernetes.io/projected/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-kube-api-access-jt6rw\") pod \"redhat-operators-fw8t4\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:02 crc kubenswrapper[4610]: I1006 09:36:02.708551 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:03 crc kubenswrapper[4610]: I1006 09:36:03.633160 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw8t4"] Oct 06 09:36:03 crc kubenswrapper[4610]: I1006 09:36:03.946716 4610 generic.go:334] "Generic (PLEG): container finished" podID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerID="27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2" exitCode=0 Oct 06 09:36:03 crc kubenswrapper[4610]: I1006 09:36:03.946800 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerDied","Data":"27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2"} Oct 06 09:36:03 crc kubenswrapper[4610]: I1006 09:36:03.946879 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerStarted","Data":"4989998e7ceaffcd4cc93bc23996275bd1c5fba9c9ddcf19cb99170f78402a35"} Oct 06 09:36:05 crc kubenswrapper[4610]: I1006 09:36:05.966196 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerStarted","Data":"96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f"} Oct 06 09:36:10 crc kubenswrapper[4610]: I1006 09:36:10.007460 4610 generic.go:334] "Generic (PLEG): container finished" podID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerID="96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f" exitCode=0 Oct 06 09:36:10 crc kubenswrapper[4610]: I1006 09:36:10.007538 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerDied","Data":"96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f"} Oct 06 09:36:11 crc kubenswrapper[4610]: I1006 09:36:11.017815 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerStarted","Data":"e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff"} Oct 06 09:36:12 crc kubenswrapper[4610]: I1006 09:36:12.713510 4610 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:12 crc kubenswrapper[4610]: I1006 09:36:12.713926 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:13 crc kubenswrapper[4610]: I1006 09:36:13.759008 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw8t4" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" probeResult="failure" output=< Oct 06 09:36:13 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:36:13 crc kubenswrapper[4610]: > Oct 06 09:36:23 crc kubenswrapper[4610]: I1006 09:36:23.762780 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw8t4" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" probeResult="failure" output=< Oct 06 09:36:23 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:36:23 crc kubenswrapper[4610]: > Oct 06 09:36:33 crc kubenswrapper[4610]: I1006 09:36:33.771908 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw8t4" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" probeResult="failure" output=< Oct 06 09:36:33 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:36:33 crc kubenswrapper[4610]: > Oct 06 09:36:43 crc kubenswrapper[4610]: I1006 09:36:43.758381 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw8t4" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" probeResult="failure" output=< Oct 06 09:36:43 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:36:43 crc kubenswrapper[4610]: > Oct 06 09:36:52 crc kubenswrapper[4610]: I1006 09:36:52.762654 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:52 crc kubenswrapper[4610]: I1006 09:36:52.801021 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fw8t4" podStartSLOduration=44.20358344 podStartE2EDuration="50.800992149s" podCreationTimestamp="2025-10-06 09:36:02 +0000 UTC" firstStartedPulling="2025-10-06 09:36:03.948519014 +0000 UTC m=+3295.663572412" lastFinishedPulling="2025-10-06 09:36:10.545927733 +0000 UTC m=+3302.260981121" observedRunningTime="2025-10-06 09:36:11.04089038 +0000 UTC m=+3302.755943768" watchObservedRunningTime="2025-10-06 09:36:52.800992149 +0000 UTC m=+3344.516045547" Oct 06 09:36:52 crc kubenswrapper[4610]: I1006 09:36:52.819760 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:53 crc kubenswrapper[4610]: I1006 09:36:53.014612 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw8t4"] Oct 06 09:36:54 crc kubenswrapper[4610]: I1006 09:36:54.445987 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fw8t4" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" containerID="cri-o://e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff" gracePeriod=2 Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.032101 4610 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.090958 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt6rw\" (UniqueName: \"kubernetes.io/projected/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-kube-api-access-jt6rw\") pod \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.091212 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-catalog-content\") pod \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.091295 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-utilities\") pod \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\" (UID: \"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a\") " Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.092198 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-utilities" (OuterVolumeSpecName: "utilities") pod "7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" (UID: "7f5c7013-e593-4dc2-bc7c-2b3b86012b4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.105423 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-kube-api-access-jt6rw" (OuterVolumeSpecName: "kube-api-access-jt6rw") pod "7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" (UID: "7f5c7013-e593-4dc2-bc7c-2b3b86012b4a"). InnerVolumeSpecName "kube-api-access-jt6rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.194319 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.194711 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt6rw\" (UniqueName: \"kubernetes.io/projected/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-kube-api-access-jt6rw\") on node \"crc\" DevicePath \"\"" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.210240 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" (UID: "7f5c7013-e593-4dc2-bc7c-2b3b86012b4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.296422 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.457159 4610 generic.go:334] "Generic (PLEG): container finished" podID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerID="e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff" exitCode=0 Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.457199 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerDied","Data":"e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff"} Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.457237 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8t4" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.457258 4610 scope.go:117] "RemoveContainer" containerID="e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.457243 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8t4" event={"ID":"7f5c7013-e593-4dc2-bc7c-2b3b86012b4a","Type":"ContainerDied","Data":"4989998e7ceaffcd4cc93bc23996275bd1c5fba9c9ddcf19cb99170f78402a35"} Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.478420 4610 scope.go:117] "RemoveContainer" containerID="96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.499727 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw8t4"] Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.506683 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fw8t4"] Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.524340 4610 scope.go:117] "RemoveContainer" containerID="27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.580327 4610 scope.go:117] "RemoveContainer" containerID="e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff" Oct 06 09:36:55 crc kubenswrapper[4610]: E1006 09:36:55.580951 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff\": container with ID starting with e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff not found: ID does not exist" containerID="e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.581019 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff"} err="failed to get container status \"e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff\": rpc error: code = NotFound desc = could not find container \"e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff\": container with ID starting with e0ce59dbb2e517af37d9464a5a3ae9e01167cafc609e14f0b76d0ae1fabbb9ff not found: ID does not exist" Oct 06 09:36:55 crc 
kubenswrapper[4610]: I1006 09:36:55.581107 4610 scope.go:117] "RemoveContainer" containerID="96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f" Oct 06 09:36:55 crc kubenswrapper[4610]: E1006 09:36:55.581595 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f\": container with ID starting with 96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f not found: ID does not exist" containerID="96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.581652 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f"} err="failed to get container status \"96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f\": rpc error: code = NotFound desc = could not find container \"96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f\": container with ID starting with 96ce67f47f7cc33c51fff5e8fb48128b91189d4eada40ea0d4153379915e592f not found: ID does not exist" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.581697 4610 scope.go:117] "RemoveContainer" containerID="27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2" Oct 06 09:36:55 crc kubenswrapper[4610]: E1006 09:36:55.582095 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2\": container with ID starting with 27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2 not found: ID does not exist" containerID="27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2" Oct 06 09:36:55 crc kubenswrapper[4610]: I1006 09:36:55.582129 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2"} err="failed to get container status \"27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2\": rpc error: code = NotFound desc = could not find container \"27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2\": container with ID starting with 27b7686f0215f07565423d266eded13b7c0ff2f044e03e8b07cd48130f120ab2 not found: ID does not exist" Oct 06 09:36:57 crc kubenswrapper[4610]: I1006 09:36:57.081104 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" path="/var/lib/kubelet/pods/7f5c7013-e593-4dc2-bc7c-2b3b86012b4a/volumes" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.055494 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v8p9h"] Oct 06 09:37:41 crc kubenswrapper[4610]: E1006 09:37:41.057462 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="extract-content" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.057561 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="extract-content" Oct 06 09:37:41 crc kubenswrapper[4610]: E1006 09:37:41.057645 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="extract-utilities" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.057723 4610 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="extract-utilities" Oct 06 09:37:41 crc kubenswrapper[4610]: E1006 09:37:41.057821 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.057897 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.058205 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5c7013-e593-4dc2-bc7c-2b3b86012b4a" containerName="registry-server" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.059982 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.093514 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8p9h"] Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.237034 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-utilities\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.237152 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-catalog-content\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.237284 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtjct\" (UniqueName: \"kubernetes.io/projected/3ef6b951-9559-447f-8b8b-f7a7a8023f16-kube-api-access-rtjct\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.339234 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtjct\" (UniqueName: \"kubernetes.io/projected/3ef6b951-9559-447f-8b8b-f7a7a8023f16-kube-api-access-rtjct\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.339644 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-utilities\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.339782 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-catalog-content\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc 
kubenswrapper[4610]: I1006 09:37:41.340164 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-utilities\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.340325 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-catalog-content\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.363688 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtjct\" (UniqueName: \"kubernetes.io/projected/3ef6b951-9559-447f-8b8b-f7a7a8023f16-kube-api-access-rtjct\") pod \"redhat-marketplace-v8p9h\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.392535 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.867300 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8p9h"] Oct 06 09:37:41 crc kubenswrapper[4610]: I1006 09:37:41.908582 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerStarted","Data":"750995d96ec3e572d748cec7a80ffebe326772ec52fc99be7752da7c45b87320"} Oct 06 09:37:42 crc kubenswrapper[4610]: I1006 09:37:42.920227 4610 generic.go:334] "Generic (PLEG): container finished" podID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerID="e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393" exitCode=0 Oct 06 09:37:42 crc kubenswrapper[4610]: I1006 09:37:42.920348 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerDied","Data":"e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393"} Oct 06 09:37:42 crc kubenswrapper[4610]: I1006 09:37:42.922614 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:37:43 crc kubenswrapper[4610]: I1006 09:37:43.931831 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerStarted","Data":"84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e"} Oct 06 09:37:44 crc kubenswrapper[4610]: I1006 09:37:44.943913 4610 generic.go:334] "Generic (PLEG): container finished" podID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerID="84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e" exitCode=0 Oct 06 09:37:44 crc kubenswrapper[4610]: I1006 09:37:44.943973 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerDied","Data":"84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e"} Oct 06 09:37:45 crc kubenswrapper[4610]: I1006 09:37:45.954665 4610 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerStarted","Data":"b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2"} Oct 06 09:37:46 crc kubenswrapper[4610]: I1006 09:37:46.469783 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:37:46 crc kubenswrapper[4610]: I1006 09:37:46.469889 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:37:51 crc kubenswrapper[4610]: I1006 09:37:51.392757 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:51 crc kubenswrapper[4610]: I1006 09:37:51.395052 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:51 crc kubenswrapper[4610]: I1006 09:37:51.440236 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:51 crc kubenswrapper[4610]: I1006 09:37:51.466139 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v8p9h" podStartSLOduration=8.034328807 podStartE2EDuration="10.466110937s" podCreationTimestamp="2025-10-06 09:37:41 +0000 UTC" firstStartedPulling="2025-10-06 09:37:42.922338175 +0000 UTC m=+3394.637391563" lastFinishedPulling="2025-10-06 09:37:45.354120305 +0000 UTC m=+3397.069173693" observedRunningTime="2025-10-06 09:37:45.979517383 +0000 UTC m=+3397.694570771" watchObservedRunningTime="2025-10-06 09:37:51.466110937 +0000 UTC m=+3403.181164335" Oct 06 09:37:52 crc kubenswrapper[4610]: I1006 09:37:52.064271 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:52 crc kubenswrapper[4610]: I1006 09:37:52.117969 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8p9h"] Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.028479 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v8p9h" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="registry-server" containerID="cri-o://b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2" gracePeriod=2 Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.743907 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.909652 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-catalog-content\") pod \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.909850 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtjct\" (UniqueName: \"kubernetes.io/projected/3ef6b951-9559-447f-8b8b-f7a7a8023f16-kube-api-access-rtjct\") pod \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.909892 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-utilities\") pod \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\" (UID: \"3ef6b951-9559-447f-8b8b-f7a7a8023f16\") " Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.910634 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-utilities" (OuterVolumeSpecName: "utilities") pod "3ef6b951-9559-447f-8b8b-f7a7a8023f16" (UID: "3ef6b951-9559-447f-8b8b-f7a7a8023f16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.918221 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef6b951-9559-447f-8b8b-f7a7a8023f16-kube-api-access-rtjct" (OuterVolumeSpecName: "kube-api-access-rtjct") pod "3ef6b951-9559-447f-8b8b-f7a7a8023f16" (UID: "3ef6b951-9559-447f-8b8b-f7a7a8023f16"). InnerVolumeSpecName "kube-api-access-rtjct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:37:54 crc kubenswrapper[4610]: I1006 09:37:54.922322 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ef6b951-9559-447f-8b8b-f7a7a8023f16" (UID: "3ef6b951-9559-447f-8b8b-f7a7a8023f16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.011929 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.011968 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtjct\" (UniqueName: \"kubernetes.io/projected/3ef6b951-9559-447f-8b8b-f7a7a8023f16-kube-api-access-rtjct\") on node \"crc\" DevicePath \"\"" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.011983 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef6b951-9559-447f-8b8b-f7a7a8023f16-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.044703 4610 generic.go:334] "Generic (PLEG): container finished" podID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerID="b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2" exitCode=0 Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.044761 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerDied","Data":"b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2"} Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.044786 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8p9h" event={"ID":"3ef6b951-9559-447f-8b8b-f7a7a8023f16","Type":"ContainerDied","Data":"750995d96ec3e572d748cec7a80ffebe326772ec52fc99be7752da7c45b87320"} Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.044807 4610 scope.go:117] "RemoveContainer" containerID="b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.044815 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8p9h" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.065308 4610 scope.go:117] "RemoveContainer" containerID="84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.109402 4610 scope.go:117] "RemoveContainer" containerID="e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.126425 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8p9h"] Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.162449 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8p9h"] Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.188272 4610 scope.go:117] "RemoveContainer" containerID="b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2" Oct 06 09:37:55 crc kubenswrapper[4610]: E1006 09:37:55.188878 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2\": container with ID starting with b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2 not found: ID does not exist" containerID="b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.188923 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2"} err="failed to get container status \"b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2\": rpc error: code = NotFound desc = could not find container \"b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2\": container with ID starting with b1135342a7bd805703998d5709db3259d146e41b979a71a6dc3046d569f4dbf2 not found: ID does not exist" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.188952 4610 scope.go:117] "RemoveContainer" containerID="84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e" Oct 06 09:37:55 crc kubenswrapper[4610]: E1006 09:37:55.191360 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e\": container with ID starting with 84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e not found: ID does not exist" containerID="84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.191393 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e"} err="failed to get container status \"84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e\": rpc error: code = NotFound desc = could not find container \"84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e\": container with ID starting with 84f6d0424c02b16dab789cf13ba54890ce28b307c23828e7c2a40b818ba2d52e not found: ID does not exist" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.191412 4610 scope.go:117] "RemoveContainer" containerID="e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393" Oct 06 09:37:55 crc kubenswrapper[4610]: E1006 09:37:55.194314 4610 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393\": container with ID starting with e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393 not found: ID does not exist" containerID="e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393" Oct 06 09:37:55 crc kubenswrapper[4610]: I1006 09:37:55.194340 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393"} err="failed to get container status \"e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393\": rpc error: code = NotFound desc = could not find container \"e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393\": container with ID starting with e136e25ce2dd6565788f350402e1331fe1142a4649155473f89d5ef279156393 not found: ID does not exist" Oct 06 09:37:57 crc kubenswrapper[4610]: I1006 09:37:57.081904 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" path="/var/lib/kubelet/pods/3ef6b951-9559-447f-8b8b-f7a7a8023f16/volumes" Oct 06 09:38:16 crc kubenswrapper[4610]: I1006 09:38:16.469263 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:38:16 crc kubenswrapper[4610]: I1006 09:38:16.469767 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:38:46 crc kubenswrapper[4610]: I1006 09:38:46.469397 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:38:46 crc kubenswrapper[4610]: I1006 09:38:46.469792 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:38:46 crc kubenswrapper[4610]: I1006 09:38:46.469832 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:38:46 crc kubenswrapper[4610]: I1006 09:38:46.470512 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc5e9fba468745d692775ab2b2689bceb84e65ce71a3475f4f0f38b55f42e31d"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:38:46 crc kubenswrapper[4610]: I1006 09:38:46.470561 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" 
podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://bc5e9fba468745d692775ab2b2689bceb84e65ce71a3475f4f0f38b55f42e31d" gracePeriod=600 Oct 06 09:38:47 crc kubenswrapper[4610]: I1006 09:38:47.481158 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="bc5e9fba468745d692775ab2b2689bceb84e65ce71a3475f4f0f38b55f42e31d" exitCode=0 Oct 06 09:38:47 crc kubenswrapper[4610]: I1006 09:38:47.481207 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"bc5e9fba468745d692775ab2b2689bceb84e65ce71a3475f4f0f38b55f42e31d"} Oct 06 09:38:47 crc kubenswrapper[4610]: I1006 09:38:47.481660 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432"} Oct 06 09:38:47 crc kubenswrapper[4610]: I1006 09:38:47.481699 4610 scope.go:117] "RemoveContainer" containerID="1f0e85bb05159fafc6b2af098953c228ba1aa664e4ca1f0eaccaac3938fdd22d" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.246931 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9tllk"] Oct 06 09:38:50 crc kubenswrapper[4610]: E1006 09:38:50.247990 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="registry-server" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.248006 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="registry-server" Oct 06 09:38:50 crc kubenswrapper[4610]: E1006 09:38:50.248040 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="extract-utilities" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.248068 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="extract-utilities" Oct 06 09:38:50 crc kubenswrapper[4610]: E1006 09:38:50.248083 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="extract-content" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.248091 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="extract-content" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.248306 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef6b951-9559-447f-8b8b-f7a7a8023f16" containerName="registry-server" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.250004 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.268189 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tllk"] Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.378292 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-catalog-content\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.378386 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-utilities\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.378488 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhrlh\" (UniqueName: \"kubernetes.io/projected/a408ec83-a360-4a73-a852-7024878dd298-kube-api-access-vhrlh\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.479768 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhrlh\" (UniqueName: \"kubernetes.io/projected/a408ec83-a360-4a73-a852-7024878dd298-kube-api-access-vhrlh\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.480265 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-catalog-content\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.480755 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-catalog-content\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.480875 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-utilities\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.481194 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-utilities\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.502727 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vhrlh\" (UniqueName: \"kubernetes.io/projected/a408ec83-a360-4a73-a852-7024878dd298-kube-api-access-vhrlh\") pod \"community-operators-9tllk\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:50 crc kubenswrapper[4610]: I1006 09:38:50.603861 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:38:51 crc kubenswrapper[4610]: I1006 09:38:51.235138 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tllk"] Oct 06 09:38:51 crc kubenswrapper[4610]: I1006 09:38:51.534071 4610 generic.go:334] "Generic (PLEG): container finished" podID="a408ec83-a360-4a73-a852-7024878dd298" containerID="ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073" exitCode=0 Oct 06 09:38:51 crc kubenswrapper[4610]: I1006 09:38:51.534123 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerDied","Data":"ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073"} Oct 06 09:38:51 crc kubenswrapper[4610]: I1006 09:38:51.534153 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerStarted","Data":"69170f62dbe887226715d166fc9a50eb45e98a376fb0cbfec8eb0cb501964e7b"} Oct 06 09:38:52 crc kubenswrapper[4610]: I1006 09:38:52.543266 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerStarted","Data":"23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce"} Oct 06 09:38:54 crc kubenswrapper[4610]: I1006 09:38:54.561860 4610 generic.go:334] "Generic (PLEG): container finished" podID="a408ec83-a360-4a73-a852-7024878dd298" containerID="23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce" exitCode=0 Oct 06 09:38:54 crc kubenswrapper[4610]: I1006 09:38:54.562015 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerDied","Data":"23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce"} Oct 06 09:38:55 crc kubenswrapper[4610]: I1006 09:38:55.573122 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerStarted","Data":"8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac"} Oct 06 09:38:55 crc kubenswrapper[4610]: I1006 09:38:55.596237 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9tllk" podStartSLOduration=2.105829656 podStartE2EDuration="5.596221541s" podCreationTimestamp="2025-10-06 09:38:50 +0000 UTC" firstStartedPulling="2025-10-06 09:38:51.536064447 +0000 UTC m=+3463.251117835" lastFinishedPulling="2025-10-06 09:38:55.026456332 +0000 UTC m=+3466.741509720" observedRunningTime="2025-10-06 09:38:55.592137704 +0000 UTC m=+3467.307191092" watchObservedRunningTime="2025-10-06 09:38:55.596221541 +0000 UTC m=+3467.311274929" Oct 06 09:39:00 crc kubenswrapper[4610]: I1006 09:39:00.604505 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:39:00 crc kubenswrapper[4610]: I1006 09:39:00.606206 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:39:01 crc kubenswrapper[4610]: I1006 09:39:01.654130 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9tllk" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="registry-server" probeResult="failure" output=< Oct 06 09:39:01 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 09:39:01 crc kubenswrapper[4610]: > Oct 06 09:39:10 crc kubenswrapper[4610]: I1006 09:39:10.657409 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:39:10 crc kubenswrapper[4610]: I1006 09:39:10.713795 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:39:10 crc kubenswrapper[4610]: I1006 09:39:10.891504 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tllk"] Oct 06 09:39:11 crc kubenswrapper[4610]: I1006 09:39:11.725530 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9tllk" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="registry-server" containerID="cri-o://8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac" gracePeriod=2 Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.271664 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.339396 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhrlh\" (UniqueName: \"kubernetes.io/projected/a408ec83-a360-4a73-a852-7024878dd298-kube-api-access-vhrlh\") pod \"a408ec83-a360-4a73-a852-7024878dd298\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.339563 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-utilities\") pod \"a408ec83-a360-4a73-a852-7024878dd298\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.339624 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-catalog-content\") pod \"a408ec83-a360-4a73-a852-7024878dd298\" (UID: \"a408ec83-a360-4a73-a852-7024878dd298\") " Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.341133 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-utilities" (OuterVolumeSpecName: "utilities") pod "a408ec83-a360-4a73-a852-7024878dd298" (UID: "a408ec83-a360-4a73-a852-7024878dd298"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.346763 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a408ec83-a360-4a73-a852-7024878dd298-kube-api-access-vhrlh" (OuterVolumeSpecName: "kube-api-access-vhrlh") pod "a408ec83-a360-4a73-a852-7024878dd298" (UID: "a408ec83-a360-4a73-a852-7024878dd298"). InnerVolumeSpecName "kube-api-access-vhrlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.347250 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.347270 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhrlh\" (UniqueName: \"kubernetes.io/projected/a408ec83-a360-4a73-a852-7024878dd298-kube-api-access-vhrlh\") on node \"crc\" DevicePath \"\"" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.394887 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a408ec83-a360-4a73-a852-7024878dd298" (UID: "a408ec83-a360-4a73-a852-7024878dd298"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.448282 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a408ec83-a360-4a73-a852-7024878dd298-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.741761 4610 generic.go:334] "Generic (PLEG): container finished" podID="a408ec83-a360-4a73-a852-7024878dd298" containerID="8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac" exitCode=0 Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.741831 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerDied","Data":"8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac"} Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.741879 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tllk" event={"ID":"a408ec83-a360-4a73-a852-7024878dd298","Type":"ContainerDied","Data":"69170f62dbe887226715d166fc9a50eb45e98a376fb0cbfec8eb0cb501964e7b"} Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.741910 4610 scope.go:117] "RemoveContainer" containerID="8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.743367 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tllk" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.766948 4610 scope.go:117] "RemoveContainer" containerID="23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.795715 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tllk"] Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.800910 4610 scope.go:117] "RemoveContainer" containerID="ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.801860 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9tllk"] Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.882618 4610 scope.go:117] "RemoveContainer" containerID="8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac" Oct 06 09:39:12 crc kubenswrapper[4610]: E1006 09:39:12.883031 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac\": container with ID starting with 8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac not found: ID does not exist" containerID="8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.883114 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac"} err="failed to get container status \"8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac\": rpc error: code = NotFound desc = could not find container \"8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac\": container with ID starting with 8cdf66fe102bcaf642d77d1014d1cbd3936ba8e6466a5f05faa0b36c25caf0ac not found: ID does not exist" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.883147 4610 scope.go:117] "RemoveContainer" containerID="23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce" Oct 06 09:39:12 crc kubenswrapper[4610]: E1006 09:39:12.883479 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce\": container with ID starting with 23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce not found: ID does not exist" containerID="23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.883508 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce"} err="failed to get container status \"23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce\": rpc error: code = NotFound desc = could not find container \"23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce\": container with ID starting with 23ea947f50a8aa24ffcbc284d627cbe124986373659ed915c4a1f76ac3d876ce not found: ID does not exist" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.883527 4610 scope.go:117] "RemoveContainer" containerID="ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073" Oct 06 09:39:12 crc kubenswrapper[4610]: E1006 09:39:12.883786 4610 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073\": container with ID starting with ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073 not found: ID does not exist" containerID="ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073" Oct 06 09:39:12 crc kubenswrapper[4610]: I1006 09:39:12.883821 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073"} err="failed to get container status \"ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073\": rpc error: code = NotFound desc = could not find container \"ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073\": container with ID starting with ec8575b1f5ffdd7d0b312998a0afd691f70c2f0ff9d2d0f7dd0ad11f95cf8073 not found: ID does not exist" Oct 06 09:39:13 crc kubenswrapper[4610]: I1006 09:39:13.080653 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a408ec83-a360-4a73-a852-7024878dd298" path="/var/lib/kubelet/pods/a408ec83-a360-4a73-a852-7024878dd298/volumes" Oct 06 09:40:46 crc kubenswrapper[4610]: I1006 09:40:46.469479 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:40:46 crc kubenswrapper[4610]: I1006 09:40:46.469915 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:41:16 crc kubenswrapper[4610]: I1006 09:41:16.469922 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:41:16 crc kubenswrapper[4610]: I1006 09:41:16.470404 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:41:46 crc kubenswrapper[4610]: I1006 09:41:46.468886 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:41:46 crc kubenswrapper[4610]: I1006 09:41:46.469309 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:41:46 crc kubenswrapper[4610]: I1006 09:41:46.469349 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:41:46 crc kubenswrapper[4610]: I1006 09:41:46.469998 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:41:46 crc kubenswrapper[4610]: I1006 09:41:46.470038 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" gracePeriod=600 Oct 06 09:41:46 crc kubenswrapper[4610]: E1006 09:41:46.682638 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:41:47 crc kubenswrapper[4610]: I1006 09:41:47.168669 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" exitCode=0 Oct 06 09:41:47 crc kubenswrapper[4610]: I1006 09:41:47.168714 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432"} Oct 06 09:41:47 crc kubenswrapper[4610]: I1006 09:41:47.168746 4610 scope.go:117] "RemoveContainer" containerID="bc5e9fba468745d692775ab2b2689bceb84e65ce71a3475f4f0f38b55f42e31d" Oct 06 09:41:47 crc kubenswrapper[4610]: I1006 09:41:47.169371 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:41:47 crc kubenswrapper[4610]: E1006 09:41:47.169601 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:41:58 crc kubenswrapper[4610]: I1006 09:41:58.112713 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:41:58 crc kubenswrapper[4610]: E1006 09:41:58.113306 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:42:10 crc 
kubenswrapper[4610]: I1006 09:42:10.070294 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:42:10 crc kubenswrapper[4610]: E1006 09:42:10.071181 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:42:21 crc kubenswrapper[4610]: I1006 09:42:21.074291 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:42:21 crc kubenswrapper[4610]: E1006 09:42:21.075130 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:42:35 crc kubenswrapper[4610]: I1006 09:42:35.070196 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:42:35 crc kubenswrapper[4610]: E1006 09:42:35.070773 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:42:49 crc kubenswrapper[4610]: I1006 09:42:49.082106 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:42:49 crc kubenswrapper[4610]: E1006 09:42:49.082628 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:43:02 crc kubenswrapper[4610]: I1006 09:43:02.070897 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:43:02 crc kubenswrapper[4610]: E1006 09:43:02.071596 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:43:17 crc kubenswrapper[4610]: I1006 09:43:17.070743 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:43:17 crc 
kubenswrapper[4610]: E1006 09:43:17.073253 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:43:29 crc kubenswrapper[4610]: I1006 09:43:29.079948 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:43:29 crc kubenswrapper[4610]: E1006 09:43:29.080892 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:43:44 crc kubenswrapper[4610]: I1006 09:43:44.071616 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:43:44 crc kubenswrapper[4610]: E1006 09:43:44.075076 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.021601 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-plqn9"] Oct 06 09:43:50 crc kubenswrapper[4610]: E1006 09:43:50.022325 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="extract-utilities" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.022343 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="extract-utilities" Oct 06 09:43:50 crc kubenswrapper[4610]: E1006 09:43:50.022377 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="registry-server" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.022385 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="registry-server" Oct 06 09:43:50 crc kubenswrapper[4610]: E1006 09:43:50.022413 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="extract-content" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.022421 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="extract-content" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.022629 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="a408ec83-a360-4a73-a852-7024878dd298" containerName="registry-server" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.024185 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.041211 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plqn9"] Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.045897 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-catalog-content\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.046166 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xjr\" (UniqueName: \"kubernetes.io/projected/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-kube-api-access-88xjr\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.046292 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-utilities\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.147741 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-catalog-content\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.147796 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xjr\" (UniqueName: \"kubernetes.io/projected/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-kube-api-access-88xjr\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.147862 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-utilities\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.148367 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-utilities\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.149033 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-catalog-content\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.180378 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-88xjr\" (UniqueName: \"kubernetes.io/projected/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-kube-api-access-88xjr\") pod \"certified-operators-plqn9\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.344457 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:43:50 crc kubenswrapper[4610]: I1006 09:43:50.936793 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plqn9"] Oct 06 09:43:51 crc kubenswrapper[4610]: I1006 09:43:51.369960 4610 generic.go:334] "Generic (PLEG): container finished" podID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerID="6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2" exitCode=0 Oct 06 09:43:51 crc kubenswrapper[4610]: I1006 09:43:51.370002 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerDied","Data":"6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2"} Oct 06 09:43:51 crc kubenswrapper[4610]: I1006 09:43:51.370030 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerStarted","Data":"cfb71af6d02aeeb25191a2f8c3f3c45bc2fade6f12da06483013d70564c1c1d6"} Oct 06 09:43:51 crc kubenswrapper[4610]: I1006 09:43:51.371563 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:43:52 crc kubenswrapper[4610]: I1006 09:43:52.382007 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerStarted","Data":"17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546"} Oct 06 09:43:54 crc kubenswrapper[4610]: I1006 09:43:54.400921 4610 generic.go:334] "Generic (PLEG): container finished" podID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerID="17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546" exitCode=0 Oct 06 09:43:54 crc kubenswrapper[4610]: I1006 09:43:54.401019 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerDied","Data":"17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546"} Oct 06 09:43:55 crc kubenswrapper[4610]: I1006 09:43:55.410995 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerStarted","Data":"a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de"} Oct 06 09:43:55 crc kubenswrapper[4610]: I1006 09:43:55.439914 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-plqn9" podStartSLOduration=3.018588903 podStartE2EDuration="6.439891076s" podCreationTimestamp="2025-10-06 09:43:49 +0000 UTC" firstStartedPulling="2025-10-06 09:43:51.371284546 +0000 UTC m=+3763.086337934" lastFinishedPulling="2025-10-06 09:43:54.792586719 +0000 UTC m=+3766.507640107" observedRunningTime="2025-10-06 09:43:55.430709306 +0000 UTC m=+3767.145762704" watchObservedRunningTime="2025-10-06 
09:43:55.439891076 +0000 UTC m=+3767.154944474" Oct 06 09:43:56 crc kubenswrapper[4610]: I1006 09:43:56.071160 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:43:56 crc kubenswrapper[4610]: E1006 09:43:56.072853 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:44:00 crc kubenswrapper[4610]: I1006 09:44:00.345633 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:44:00 crc kubenswrapper[4610]: I1006 09:44:00.346214 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:44:00 crc kubenswrapper[4610]: I1006 09:44:00.418223 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:44:00 crc kubenswrapper[4610]: I1006 09:44:00.500148 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:44:00 crc kubenswrapper[4610]: I1006 09:44:00.651610 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-plqn9"] Oct 06 09:44:02 crc kubenswrapper[4610]: I1006 09:44:02.471806 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-plqn9" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="registry-server" containerID="cri-o://a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de" gracePeriod=2 Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.006317 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.149888 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-catalog-content\") pod \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.150226 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-utilities\") pod \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.150327 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88xjr\" (UniqueName: \"kubernetes.io/projected/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-kube-api-access-88xjr\") pod \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\" (UID: \"7955eed3-cae6-4ecf-9bea-3eda0cc66a52\") " Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.151948 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-utilities" (OuterVolumeSpecName: "utilities") pod "7955eed3-cae6-4ecf-9bea-3eda0cc66a52" (UID: "7955eed3-cae6-4ecf-9bea-3eda0cc66a52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.166418 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-kube-api-access-88xjr" (OuterVolumeSpecName: "kube-api-access-88xjr") pod "7955eed3-cae6-4ecf-9bea-3eda0cc66a52" (UID: "7955eed3-cae6-4ecf-9bea-3eda0cc66a52"). InnerVolumeSpecName "kube-api-access-88xjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.206562 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7955eed3-cae6-4ecf-9bea-3eda0cc66a52" (UID: "7955eed3-cae6-4ecf-9bea-3eda0cc66a52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.252807 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.252853 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.252868 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88xjr\" (UniqueName: \"kubernetes.io/projected/7955eed3-cae6-4ecf-9bea-3eda0cc66a52-kube-api-access-88xjr\") on node \"crc\" DevicePath \"\"" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.481443 4610 generic.go:334] "Generic (PLEG): container finished" podID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerID="a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de" exitCode=0 Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.481511 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-plqn9" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.481524 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerDied","Data":"a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de"} Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.482848 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plqn9" event={"ID":"7955eed3-cae6-4ecf-9bea-3eda0cc66a52","Type":"ContainerDied","Data":"cfb71af6d02aeeb25191a2f8c3f3c45bc2fade6f12da06483013d70564c1c1d6"} Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.482922 4610 scope.go:117] "RemoveContainer" containerID="a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.509027 4610 scope.go:117] "RemoveContainer" containerID="17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.522713 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-plqn9"] Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.534419 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-plqn9"] Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.537988 4610 scope.go:117] "RemoveContainer" containerID="6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.579173 4610 scope.go:117] "RemoveContainer" containerID="a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de" Oct 06 09:44:03 crc kubenswrapper[4610]: E1006 09:44:03.579695 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de\": container with ID starting with a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de not found: ID does not exist" containerID="a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.579827 
4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de"} err="failed to get container status \"a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de\": rpc error: code = NotFound desc = could not find container \"a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de\": container with ID starting with a0e47d027ad495fd648ead9baf9f1113f2f3b243be0b1242ed84fa96570697de not found: ID does not exist" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.579944 4610 scope.go:117] "RemoveContainer" containerID="17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546" Oct 06 09:44:03 crc kubenswrapper[4610]: E1006 09:44:03.580451 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546\": container with ID starting with 17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546 not found: ID does not exist" containerID="17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.580485 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546"} err="failed to get container status \"17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546\": rpc error: code = NotFound desc = could not find container \"17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546\": container with ID starting with 17293f0b754c06354763264707be721b2c7de4f20030b52ea4361b2f29db1546 not found: ID does not exist" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.580506 4610 scope.go:117] "RemoveContainer" containerID="6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2" Oct 06 09:44:03 crc kubenswrapper[4610]: E1006 09:44:03.580868 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2\": container with ID starting with 6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2 not found: ID does not exist" containerID="6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2" Oct 06 09:44:03 crc kubenswrapper[4610]: I1006 09:44:03.580948 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2"} err="failed to get container status \"6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2\": rpc error: code = NotFound desc = could not find container \"6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2\": container with ID starting with 6bd9830e77ddc8dfe58a8cdd4ab38bf7583bfb285d084e415e62b4389fce40c2 not found: ID does not exist" Oct 06 09:44:05 crc kubenswrapper[4610]: I1006 09:44:05.080078 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" path="/var/lib/kubelet/pods/7955eed3-cae6-4ecf-9bea-3eda0cc66a52/volumes" Oct 06 09:44:09 crc kubenswrapper[4610]: I1006 09:44:09.076349 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:44:09 crc kubenswrapper[4610]: E1006 09:44:09.076946 4610 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:44:21 crc kubenswrapper[4610]: I1006 09:44:21.074702 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:44:21 crc kubenswrapper[4610]: E1006 09:44:21.075458 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:44:35 crc kubenswrapper[4610]: I1006 09:44:35.070552 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:44:35 crc kubenswrapper[4610]: E1006 09:44:35.071620 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:44:47 crc kubenswrapper[4610]: I1006 09:44:47.070130 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:44:47 crc kubenswrapper[4610]: E1006 09:44:47.070920 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.070081 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:45:00 crc kubenswrapper[4610]: E1006 09:45:00.070916 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.221003 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg"] Oct 06 09:45:00 crc kubenswrapper[4610]: E1006 09:45:00.221467 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="extract-utilities" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.221488 4610 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="extract-utilities" Oct 06 09:45:00 crc kubenswrapper[4610]: E1006 09:45:00.221497 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="registry-server" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.221504 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="registry-server" Oct 06 09:45:00 crc kubenswrapper[4610]: E1006 09:45:00.221522 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="extract-content" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.221528 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="extract-content" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.221714 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7955eed3-cae6-4ecf-9bea-3eda0cc66a52" containerName="registry-server" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.225802 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.230304 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.235799 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.236597 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg"] Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.305112 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/410c6384-a872-4866-90a6-5b64de86dd39-secret-volume\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.305233 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/410c6384-a872-4866-90a6-5b64de86dd39-config-volume\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.305287 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf4h\" (UniqueName: \"kubernetes.io/projected/410c6384-a872-4866-90a6-5b64de86dd39-kube-api-access-2zf4h\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.406822 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/410c6384-a872-4866-90a6-5b64de86dd39-config-volume\") pod \"collect-profiles-29329065-bgrsg\" (UID: 
\"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.406924 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zf4h\" (UniqueName: \"kubernetes.io/projected/410c6384-a872-4866-90a6-5b64de86dd39-kube-api-access-2zf4h\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.407172 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/410c6384-a872-4866-90a6-5b64de86dd39-secret-volume\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.407827 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/410c6384-a872-4866-90a6-5b64de86dd39-config-volume\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.414630 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/410c6384-a872-4866-90a6-5b64de86dd39-secret-volume\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.432398 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zf4h\" (UniqueName: \"kubernetes.io/projected/410c6384-a872-4866-90a6-5b64de86dd39-kube-api-access-2zf4h\") pod \"collect-profiles-29329065-bgrsg\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:00 crc kubenswrapper[4610]: I1006 09:45:00.546768 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:01 crc kubenswrapper[4610]: I1006 09:45:01.018604 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg"] Oct 06 09:45:01 crc kubenswrapper[4610]: I1006 09:45:01.992774 4610 generic.go:334] "Generic (PLEG): container finished" podID="410c6384-a872-4866-90a6-5b64de86dd39" containerID="1db949d2213b9480702fa296b589c37a59738ee06dd62fa2e3fb41b63d6211b3" exitCode=0 Oct 06 09:45:01 crc kubenswrapper[4610]: I1006 09:45:01.992822 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" event={"ID":"410c6384-a872-4866-90a6-5b64de86dd39","Type":"ContainerDied","Data":"1db949d2213b9480702fa296b589c37a59738ee06dd62fa2e3fb41b63d6211b3"} Oct 06 09:45:01 crc kubenswrapper[4610]: I1006 09:45:01.993173 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" event={"ID":"410c6384-a872-4866-90a6-5b64de86dd39","Type":"ContainerStarted","Data":"90388b72d2926330cfb0ff70492add8f8e00a69a0c44b5cbc9658f2674a9fa29"} Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.478541 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.565158 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/410c6384-a872-4866-90a6-5b64de86dd39-config-volume\") pod \"410c6384-a872-4866-90a6-5b64de86dd39\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.565529 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/410c6384-a872-4866-90a6-5b64de86dd39-secret-volume\") pod \"410c6384-a872-4866-90a6-5b64de86dd39\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.565815 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410c6384-a872-4866-90a6-5b64de86dd39-config-volume" (OuterVolumeSpecName: "config-volume") pod "410c6384-a872-4866-90a6-5b64de86dd39" (UID: "410c6384-a872-4866-90a6-5b64de86dd39"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.565837 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zf4h\" (UniqueName: \"kubernetes.io/projected/410c6384-a872-4866-90a6-5b64de86dd39-kube-api-access-2zf4h\") pod \"410c6384-a872-4866-90a6-5b64de86dd39\" (UID: \"410c6384-a872-4866-90a6-5b64de86dd39\") " Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.566767 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/410c6384-a872-4866-90a6-5b64de86dd39-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.579324 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410c6384-a872-4866-90a6-5b64de86dd39-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "410c6384-a872-4866-90a6-5b64de86dd39" (UID: "410c6384-a872-4866-90a6-5b64de86dd39"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.581190 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410c6384-a872-4866-90a6-5b64de86dd39-kube-api-access-2zf4h" (OuterVolumeSpecName: "kube-api-access-2zf4h") pod "410c6384-a872-4866-90a6-5b64de86dd39" (UID: "410c6384-a872-4866-90a6-5b64de86dd39"). InnerVolumeSpecName "kube-api-access-2zf4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.668892 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zf4h\" (UniqueName: \"kubernetes.io/projected/410c6384-a872-4866-90a6-5b64de86dd39-kube-api-access-2zf4h\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:03 crc kubenswrapper[4610]: I1006 09:45:03.668922 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/410c6384-a872-4866-90a6-5b64de86dd39-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:45:04 crc kubenswrapper[4610]: I1006 09:45:04.015272 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" event={"ID":"410c6384-a872-4866-90a6-5b64de86dd39","Type":"ContainerDied","Data":"90388b72d2926330cfb0ff70492add8f8e00a69a0c44b5cbc9658f2674a9fa29"} Oct 06 09:45:04 crc kubenswrapper[4610]: I1006 09:45:04.015544 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90388b72d2926330cfb0ff70492add8f8e00a69a0c44b5cbc9658f2674a9fa29" Oct 06 09:45:04 crc kubenswrapper[4610]: I1006 09:45:04.015383 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329065-bgrsg" Oct 06 09:45:04 crc kubenswrapper[4610]: I1006 09:45:04.561160 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"] Oct 06 09:45:04 crc kubenswrapper[4610]: I1006 09:45:04.568451 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-cv4m2"] Oct 06 09:45:05 crc kubenswrapper[4610]: I1006 09:45:05.083681 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d61c79-991d-4310-be2a-577b04033e43" path="/var/lib/kubelet/pods/85d61c79-991d-4310-be2a-577b04033e43/volumes" Oct 06 09:45:15 crc kubenswrapper[4610]: I1006 09:45:15.071948 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:45:15 crc kubenswrapper[4610]: E1006 09:45:15.072856 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:45:15 crc kubenswrapper[4610]: I1006 09:45:15.572402 4610 scope.go:117] "RemoveContainer" containerID="a72c8fd83747734d80662ad6ce586ffba07a37f7a62f43ce39d6f09c91ce4974" Oct 06 09:45:27 crc kubenswrapper[4610]: I1006 09:45:27.074219 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:45:27 crc kubenswrapper[4610]: E1006 09:45:27.075022 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:45:40 crc kubenswrapper[4610]: I1006 09:45:40.070604 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:45:40 crc kubenswrapper[4610]: E1006 09:45:40.071846 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:45:53 crc kubenswrapper[4610]: I1006 09:45:53.070662 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:45:53 crc kubenswrapper[4610]: E1006 09:45:53.071748 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:46:05 crc kubenswrapper[4610]: I1006 09:46:05.071186 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:46:05 crc kubenswrapper[4610]: E1006 09:46:05.071949 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:46:16 crc kubenswrapper[4610]: I1006 09:46:16.070965 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:46:16 crc kubenswrapper[4610]: E1006 09:46:16.071813 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:46:28 crc kubenswrapper[4610]: I1006 09:46:28.070296 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:46:28 crc kubenswrapper[4610]: E1006 09:46:28.071100 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.387048 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5nc9"] Oct 06 09:46:30 crc kubenswrapper[4610]: E1006 09:46:30.393833 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410c6384-a872-4866-90a6-5b64de86dd39" containerName="collect-profiles" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.393870 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="410c6384-a872-4866-90a6-5b64de86dd39" containerName="collect-profiles" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.394308 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="410c6384-a872-4866-90a6-5b64de86dd39" containerName="collect-profiles" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.397401 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.400271 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5nc9"] Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.493380 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ffgf\" (UniqueName: \"kubernetes.io/projected/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-kube-api-access-2ffgf\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.493518 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-catalog-content\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.493579 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-utilities\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.595718 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ffgf\" (UniqueName: \"kubernetes.io/projected/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-kube-api-access-2ffgf\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.595843 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-catalog-content\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.595904 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-utilities\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.596358 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-utilities\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.596972 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-catalog-content\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.615803 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2ffgf\" (UniqueName: \"kubernetes.io/projected/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-kube-api-access-2ffgf\") pod \"redhat-operators-t5nc9\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:30 crc kubenswrapper[4610]: I1006 09:46:30.751061 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:31 crc kubenswrapper[4610]: I1006 09:46:31.233157 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5nc9"] Oct 06 09:46:31 crc kubenswrapper[4610]: I1006 09:46:31.872920 4610 generic.go:334] "Generic (PLEG): container finished" podID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerID="5e274589a3952f7b84d748be0745a561208da13d121b5aeda5a55070fc8a2ef7" exitCode=0 Oct 06 09:46:31 crc kubenswrapper[4610]: I1006 09:46:31.873012 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerDied","Data":"5e274589a3952f7b84d748be0745a561208da13d121b5aeda5a55070fc8a2ef7"} Oct 06 09:46:31 crc kubenswrapper[4610]: I1006 09:46:31.873285 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerStarted","Data":"5ef48cd772d1f30e690428b381e565c4c4832fc78c45eac9cedf6b824a998830"} Oct 06 09:46:33 crc kubenswrapper[4610]: I1006 09:46:33.896574 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerStarted","Data":"024bf0e17ad883f93d1240cd96d2df33eda8f1d72c88b2324f8574f3c04d015c"} Oct 06 09:46:40 crc kubenswrapper[4610]: I1006 09:46:40.966308 4610 generic.go:334] "Generic (PLEG): container finished" podID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerID="024bf0e17ad883f93d1240cd96d2df33eda8f1d72c88b2324f8574f3c04d015c" exitCode=0 Oct 06 09:46:40 crc kubenswrapper[4610]: I1006 09:46:40.966463 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerDied","Data":"024bf0e17ad883f93d1240cd96d2df33eda8f1d72c88b2324f8574f3c04d015c"} Oct 06 09:46:41 crc kubenswrapper[4610]: I1006 09:46:41.070334 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:46:41 crc kubenswrapper[4610]: E1006 09:46:41.070778 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:46:41 crc kubenswrapper[4610]: I1006 09:46:41.977259 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerStarted","Data":"5fba5659cf082a05ec4d805b145399b0e802f8b553deb3e28142b482abf8c4ee"} Oct 06 09:46:42 crc kubenswrapper[4610]: I1006 09:46:42.004315 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-t5nc9" podStartSLOduration=2.467548542 podStartE2EDuration="12.004295364s" podCreationTimestamp="2025-10-06 09:46:30 +0000 UTC" firstStartedPulling="2025-10-06 09:46:31.875522589 +0000 UTC m=+3923.590575977" lastFinishedPulling="2025-10-06 09:46:41.412269401 +0000 UTC m=+3933.127322799" observedRunningTime="2025-10-06 09:46:41.997601139 +0000 UTC m=+3933.712654547" watchObservedRunningTime="2025-10-06 09:46:42.004295364 +0000 UTC m=+3933.719348772" Oct 06 09:46:50 crc kubenswrapper[4610]: I1006 09:46:50.751174 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:50 crc kubenswrapper[4610]: I1006 09:46:50.751642 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:50 crc kubenswrapper[4610]: I1006 09:46:50.794497 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:51 crc kubenswrapper[4610]: I1006 09:46:51.158592 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:51 crc kubenswrapper[4610]: I1006 09:46:51.209035 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5nc9"] Oct 06 09:46:53 crc kubenswrapper[4610]: I1006 09:46:53.070229 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432" Oct 06 09:46:53 crc kubenswrapper[4610]: I1006 09:46:53.111763 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t5nc9" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="registry-server" containerID="cri-o://5fba5659cf082a05ec4d805b145399b0e802f8b553deb3e28142b482abf8c4ee" gracePeriod=2 Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.125352 4610 generic.go:334] "Generic (PLEG): container finished" podID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerID="5fba5659cf082a05ec4d805b145399b0e802f8b553deb3e28142b482abf8c4ee" exitCode=0 Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.125412 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerDied","Data":"5fba5659cf082a05ec4d805b145399b0e802f8b553deb3e28142b482abf8c4ee"} Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.133743 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"66901af95c9533a46c6d3bb2dbab4896780eb28dec7cd15cbc9eeacd28bd9eb7"} Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.416026 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.575712 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ffgf\" (UniqueName: \"kubernetes.io/projected/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-kube-api-access-2ffgf\") pod \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.575805 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-utilities\") pod \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.577476 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-utilities" (OuterVolumeSpecName: "utilities") pod "f9b8d90a-45db-40e9-9f7e-189c9d3b533a" (UID: "f9b8d90a-45db-40e9-9f7e-189c9d3b533a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.577665 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-catalog-content\") pod \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\" (UID: \"f9b8d90a-45db-40e9-9f7e-189c9d3b533a\") " Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.578447 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.585389 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-kube-api-access-2ffgf" (OuterVolumeSpecName: "kube-api-access-2ffgf") pod "f9b8d90a-45db-40e9-9f7e-189c9d3b533a" (UID: "f9b8d90a-45db-40e9-9f7e-189c9d3b533a"). InnerVolumeSpecName "kube-api-access-2ffgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.680815 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ffgf\" (UniqueName: \"kubernetes.io/projected/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-kube-api-access-2ffgf\") on node \"crc\" DevicePath \"\"" Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.696955 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9b8d90a-45db-40e9-9f7e-189c9d3b533a" (UID: "f9b8d90a-45db-40e9-9f7e-189c9d3b533a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:46:54 crc kubenswrapper[4610]: I1006 09:46:54.782674 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b8d90a-45db-40e9-9f7e-189c9d3b533a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.146554 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5nc9" event={"ID":"f9b8d90a-45db-40e9-9f7e-189c9d3b533a","Type":"ContainerDied","Data":"5ef48cd772d1f30e690428b381e565c4c4832fc78c45eac9cedf6b824a998830"} Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.146626 4610 scope.go:117] "RemoveContainer" containerID="5fba5659cf082a05ec4d805b145399b0e802f8b553deb3e28142b482abf8c4ee" Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.146841 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5nc9" Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.181918 4610 scope.go:117] "RemoveContainer" containerID="024bf0e17ad883f93d1240cd96d2df33eda8f1d72c88b2324f8574f3c04d015c" Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.182925 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5nc9"] Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.192314 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t5nc9"] Oct 06 09:46:55 crc kubenswrapper[4610]: I1006 09:46:55.504683 4610 scope.go:117] "RemoveContainer" containerID="5e274589a3952f7b84d748be0745a561208da13d121b5aeda5a55070fc8a2ef7" Oct 06 09:46:57 crc kubenswrapper[4610]: I1006 09:46:57.081606 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" path="/var/lib/kubelet/pods/f9b8d90a-45db-40e9-9f7e-189c9d3b533a/volumes" Oct 06 09:49:16 crc kubenswrapper[4610]: I1006 09:49:16.468717 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:49:16 crc kubenswrapper[4610]: I1006 09:49:16.469246 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:49:46 crc kubenswrapper[4610]: I1006 09:49:46.468866 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:49:46 crc kubenswrapper[4610]: I1006 09:49:46.469464 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.643982 4610 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5kcl4"] Oct 06 09:50:03 crc kubenswrapper[4610]: E1006 09:50:03.645008 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="extract-utilities" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.645026 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="extract-utilities" Oct 06 09:50:03 crc kubenswrapper[4610]: E1006 09:50:03.645086 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="registry-server" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.645095 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="registry-server" Oct 06 09:50:03 crc kubenswrapper[4610]: E1006 09:50:03.645131 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="extract-content" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.645139 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="extract-content" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.645373 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b8d90a-45db-40e9-9f7e-189c9d3b533a" containerName="registry-server" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.649518 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.670839 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kcl4"] Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.763450 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-catalog-content\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.763523 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-utilities\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.763551 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qdlb\" (UniqueName: \"kubernetes.io/projected/33936d1f-9763-4c95-9095-5879db2be24a-kube-api-access-4qdlb\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.865157 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-catalog-content\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 
09:50:03.865241 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-utilities\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.865274 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qdlb\" (UniqueName: \"kubernetes.io/projected/33936d1f-9763-4c95-9095-5879db2be24a-kube-api-access-4qdlb\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.865881 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-catalog-content\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.865886 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-utilities\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.885976 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qdlb\" (UniqueName: \"kubernetes.io/projected/33936d1f-9763-4c95-9095-5879db2be24a-kube-api-access-4qdlb\") pod \"community-operators-5kcl4\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") " pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:03 crc kubenswrapper[4610]: I1006 09:50:03.967318 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kcl4" Oct 06 09:50:04 crc kubenswrapper[4610]: I1006 09:50:04.542741 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kcl4"] Oct 06 09:50:04 crc kubenswrapper[4610]: I1006 09:50:04.913287 4610 generic.go:334] "Generic (PLEG): container finished" podID="33936d1f-9763-4c95-9095-5879db2be24a" containerID="7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c" exitCode=0 Oct 06 09:50:04 crc kubenswrapper[4610]: I1006 09:50:04.913452 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerDied","Data":"7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c"} Oct 06 09:50:04 crc kubenswrapper[4610]: I1006 09:50:04.913537 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerStarted","Data":"8f2c0d0ffd85e1902fc7a35b24805f607e29ca7837ae57196f752c35743a3080"} Oct 06 09:50:04 crc kubenswrapper[4610]: I1006 09:50:04.915480 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:50:05 crc kubenswrapper[4610]: I1006 09:50:05.924659 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerStarted","Data":"c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60"} Oct 06 09:50:07 crc kubenswrapper[4610]: E1006 09:50:07.447576 4610 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33936d1f_9763_4c95_9095_5879db2be24a.slice/crio-conmon-c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60.scope\": RecentStats: unable to find data in memory cache]" Oct 06 09:50:07 crc kubenswrapper[4610]: I1006 09:50:07.953745 4610 generic.go:334] "Generic (PLEG): container finished" podID="33936d1f-9763-4c95-9095-5879db2be24a" containerID="c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60" exitCode=0 Oct 06 09:50:07 crc kubenswrapper[4610]: I1006 09:50:07.953794 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerDied","Data":"c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60"} Oct 06 09:50:08 crc kubenswrapper[4610]: I1006 09:50:08.968775 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerStarted","Data":"1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed"} Oct 06 09:50:08 crc kubenswrapper[4610]: I1006 09:50:08.987320 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5kcl4" podStartSLOduration=2.501858357 podStartE2EDuration="5.987298896s" podCreationTimestamp="2025-10-06 09:50:03 +0000 UTC" firstStartedPulling="2025-10-06 09:50:04.915273667 +0000 UTC m=+4136.630327045" lastFinishedPulling="2025-10-06 09:50:08.400714196 +0000 UTC m=+4140.115767584" observedRunningTime="2025-10-06 09:50:08.983428116 +0000 UTC m=+4140.698481504" watchObservedRunningTime="2025-10-06 
Oct 06 09:50:13 crc kubenswrapper[4610]: I1006 09:50:13.967775 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5kcl4"
Oct 06 09:50:13 crc kubenswrapper[4610]: I1006 09:50:13.968416 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5kcl4"
Oct 06 09:50:15 crc kubenswrapper[4610]: I1006 09:50:15.034270 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5kcl4" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="registry-server" probeResult="failure" output=<
Oct 06 09:50:15 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s
Oct 06 09:50:15 crc kubenswrapper[4610]: >
Oct 06 09:50:16 crc kubenswrapper[4610]: I1006 09:50:16.469256 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 09:50:16 crc kubenswrapper[4610]: I1006 09:50:16.469317 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 09:50:16 crc kubenswrapper[4610]: I1006 09:50:16.469366 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr"
Oct 06 09:50:16 crc kubenswrapper[4610]: I1006 09:50:16.470144 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66901af95c9533a46c6d3bb2dbab4896780eb28dec7cd15cbc9eeacd28bd9eb7"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 09:50:16 crc kubenswrapper[4610]: I1006 09:50:16.470211 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://66901af95c9533a46c6d3bb2dbab4896780eb28dec7cd15cbc9eeacd28bd9eb7" gracePeriod=600
Oct 06 09:50:17 crc kubenswrapper[4610]: I1006 09:50:17.046597 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="66901af95c9533a46c6d3bb2dbab4896780eb28dec7cd15cbc9eeacd28bd9eb7" exitCode=0
Oct 06 09:50:17 crc kubenswrapper[4610]: I1006 09:50:17.046702 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"66901af95c9533a46c6d3bb2dbab4896780eb28dec7cd15cbc9eeacd28bd9eb7"}
Oct 06 09:50:17 crc kubenswrapper[4610]: I1006 09:50:17.046953 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"}
Oct 06 09:50:17 crc kubenswrapper[4610]: I1006 09:50:17.046980 4610 scope.go:117] "RemoveContainer" containerID="4f83591b6cdb3a3f995fccfb3a2f408308258a31629e18ad893a6c41e1787432"
Oct 06 09:50:24 crc kubenswrapper[4610]: I1006 09:50:24.015363 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5kcl4"
Oct 06 09:50:24 crc kubenswrapper[4610]: I1006 09:50:24.066344 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5kcl4"
Oct 06 09:50:24 crc kubenswrapper[4610]: I1006 09:50:24.252632 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kcl4"]
Oct 06 09:50:25 crc kubenswrapper[4610]: I1006 09:50:25.115453 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5kcl4" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="registry-server" containerID="cri-o://1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed" gracePeriod=2
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.028452 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kcl4"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.127437 4610 generic.go:334] "Generic (PLEG): container finished" podID="33936d1f-9763-4c95-9095-5879db2be24a" containerID="1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed" exitCode=0
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.127489 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerDied","Data":"1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed"}
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.127505 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kcl4"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.128492 4610 scope.go:117] "RemoveContainer" containerID="1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.128406 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kcl4" event={"ID":"33936d1f-9763-4c95-9095-5879db2be24a","Type":"ContainerDied","Data":"8f2c0d0ffd85e1902fc7a35b24805f607e29ca7837ae57196f752c35743a3080"}
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.149549 4610 scope.go:117] "RemoveContainer" containerID="c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.173816 4610 scope.go:117] "RemoveContainer" containerID="7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.175453 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-catalog-content\") pod \"33936d1f-9763-4c95-9095-5879db2be24a\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") "
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.175621 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qdlb\" (UniqueName: \"kubernetes.io/projected/33936d1f-9763-4c95-9095-5879db2be24a-kube-api-access-4qdlb\") pod \"33936d1f-9763-4c95-9095-5879db2be24a\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") "
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.175691 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-utilities\") pod \"33936d1f-9763-4c95-9095-5879db2be24a\" (UID: \"33936d1f-9763-4c95-9095-5879db2be24a\") "
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.177747 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-utilities" (OuterVolumeSpecName: "utilities") pod "33936d1f-9763-4c95-9095-5879db2be24a" (UID: "33936d1f-9763-4c95-9095-5879db2be24a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.183111 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33936d1f-9763-4c95-9095-5879db2be24a-kube-api-access-4qdlb" (OuterVolumeSpecName: "kube-api-access-4qdlb") pod "33936d1f-9763-4c95-9095-5879db2be24a" (UID: "33936d1f-9763-4c95-9095-5879db2be24a"). InnerVolumeSpecName "kube-api-access-4qdlb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.248699 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33936d1f-9763-4c95-9095-5879db2be24a" (UID: "33936d1f-9763-4c95-9095-5879db2be24a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.260564 4610 scope.go:117] "RemoveContainer" containerID="1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed"
Oct 06 09:50:26 crc kubenswrapper[4610]: E1006 09:50:26.261531 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed\": container with ID starting with 1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed not found: ID does not exist" containerID="1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.261562 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed"} err="failed to get container status \"1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed\": rpc error: code = NotFound desc = could not find container \"1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed\": container with ID starting with 1d6d7633001841ce5581d78a49bee22fee18607c948bf0759475ea7ac16fa1ed not found: ID does not exist"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.261582 4610 scope.go:117] "RemoveContainer" containerID="c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60"
Oct 06 09:50:26 crc kubenswrapper[4610]: E1006 09:50:26.261938 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60\": container with ID starting with c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60 not found: ID does not exist" containerID="c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.261968 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60"} err="failed to get container status \"c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60\": rpc error: code = NotFound desc = could not find container \"c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60\": container with ID starting with c711a27795ebe0690351fbbb7e328f19962c7232d7b25896ce3739889d3f2f60 not found: ID does not exist"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.261982 4610 scope.go:117] "RemoveContainer" containerID="7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c"
Oct 06 09:50:26 crc kubenswrapper[4610]: E1006 09:50:26.262319 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c\": container with ID starting with 7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c not found: ID does not exist" containerID="7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.262457 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c"} err="failed to get container status \"7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c\": rpc error: code = NotFound desc = could not find container \"7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c\": container with ID starting with 7fb76b5707d6336f71895e4062febc4f29dc4ae8afd2c46e518536e5e8979f1c not found: ID does not exist"
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.278144 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qdlb\" (UniqueName: \"kubernetes.io/projected/33936d1f-9763-4c95-9095-5879db2be24a-kube-api-access-4qdlb\") on node \"crc\" DevicePath \"\""
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.278464 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.278567 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33936d1f-9763-4c95-9095-5879db2be24a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.461254 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kcl4"]
Oct 06 09:50:26 crc kubenswrapper[4610]: I1006 09:50:26.468152 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5kcl4"]
Oct 06 09:50:27 crc kubenswrapper[4610]: I1006 09:50:27.080602 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33936d1f-9763-4c95-9095-5879db2be24a" path="/var/lib/kubelet/pods/33936d1f-9763-4c95-9095-5879db2be24a/volumes"
Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.575033 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfnmq"]
Oct 06 09:51:13 crc kubenswrapper[4610]: E1006 09:51:13.575935 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="registry-server"
Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.575948 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="registry-server"
Oct 06 09:51:13 crc kubenswrapper[4610]: E1006 09:51:13.575971 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="extract-utilities"
Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.575978 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="extract-utilities"
Oct 06 09:51:13 crc kubenswrapper[4610]: E1006 09:51:13.575994 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="extract-content"
Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.576000 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="extract-content"
Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.576186 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="33936d1f-9763-4c95-9095-5879db2be24a" containerName="registry-server"
Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.578082 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfnmq"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.609842 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfnmq"] Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.664570 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-catalog-content\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.664714 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdxrq\" (UniqueName: \"kubernetes.io/projected/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-kube-api-access-pdxrq\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.664747 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-utilities\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.765453 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-catalog-content\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.765557 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdxrq\" (UniqueName: \"kubernetes.io/projected/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-kube-api-access-pdxrq\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.765586 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-utilities\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.765904 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-catalog-content\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.765959 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-utilities\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.787121 4610 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pdxrq\" (UniqueName: \"kubernetes.io/projected/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-kube-api-access-pdxrq\") pod \"redhat-marketplace-sfnmq\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:13 crc kubenswrapper[4610]: I1006 09:51:13.912186 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:14 crc kubenswrapper[4610]: I1006 09:51:14.400729 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfnmq"] Oct 06 09:51:14 crc kubenswrapper[4610]: I1006 09:51:14.589655 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfnmq" event={"ID":"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe","Type":"ContainerStarted","Data":"6284e7581a194322573b1a8df4e61d7d25b9b5c1e7e01bb4a11b747a6350aee5"} Oct 06 09:51:15 crc kubenswrapper[4610]: I1006 09:51:15.601108 4610 generic.go:334] "Generic (PLEG): container finished" podID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerID="70c72f18a0869457e7e9f95109aee52b666554f78d7de547932ce1307f4ed9ef" exitCode=0 Oct 06 09:51:15 crc kubenswrapper[4610]: I1006 09:51:15.601190 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfnmq" event={"ID":"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe","Type":"ContainerDied","Data":"70c72f18a0869457e7e9f95109aee52b666554f78d7de547932ce1307f4ed9ef"} Oct 06 09:51:17 crc kubenswrapper[4610]: I1006 09:51:17.617825 4610 generic.go:334] "Generic (PLEG): container finished" podID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerID="52d358253ed6727081214690c58a23e3ab74c4c85b8f70b15f9879ad7a8f6e6d" exitCode=0 Oct 06 09:51:17 crc kubenswrapper[4610]: I1006 09:51:17.617871 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfnmq" event={"ID":"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe","Type":"ContainerDied","Data":"52d358253ed6727081214690c58a23e3ab74c4c85b8f70b15f9879ad7a8f6e6d"} Oct 06 09:51:18 crc kubenswrapper[4610]: I1006 09:51:18.631493 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfnmq" event={"ID":"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe","Type":"ContainerStarted","Data":"a89956d627bee55629498dc38a0f0e471872471194de7e6f433d1fb94f0936b8"} Oct 06 09:51:18 crc kubenswrapper[4610]: I1006 09:51:18.658694 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfnmq" podStartSLOduration=3.277785481 podStartE2EDuration="5.65867304s" podCreationTimestamp="2025-10-06 09:51:13 +0000 UTC" firstStartedPulling="2025-10-06 09:51:15.604460965 +0000 UTC m=+4207.319514353" lastFinishedPulling="2025-10-06 09:51:17.985348524 +0000 UTC m=+4209.700401912" observedRunningTime="2025-10-06 09:51:18.649546614 +0000 UTC m=+4210.364600022" watchObservedRunningTime="2025-10-06 09:51:18.65867304 +0000 UTC m=+4210.373726448" Oct 06 09:51:23 crc kubenswrapper[4610]: I1006 09:51:23.912651 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:23 crc kubenswrapper[4610]: I1006 09:51:23.914350 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:23 crc kubenswrapper[4610]: I1006 09:51:23.988924 4610 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:24 crc kubenswrapper[4610]: I1006 09:51:24.746712 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:24 crc kubenswrapper[4610]: I1006 09:51:24.804712 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfnmq"] Oct 06 09:51:26 crc kubenswrapper[4610]: I1006 09:51:26.705194 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfnmq" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="registry-server" containerID="cri-o://a89956d627bee55629498dc38a0f0e471872471194de7e6f433d1fb94f0936b8" gracePeriod=2 Oct 06 09:51:27 crc kubenswrapper[4610]: I1006 09:51:27.719766 4610 generic.go:334] "Generic (PLEG): container finished" podID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerID="a89956d627bee55629498dc38a0f0e471872471194de7e6f433d1fb94f0936b8" exitCode=0 Oct 06 09:51:27 crc kubenswrapper[4610]: I1006 09:51:27.719841 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfnmq" event={"ID":"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe","Type":"ContainerDied","Data":"a89956d627bee55629498dc38a0f0e471872471194de7e6f433d1fb94f0936b8"} Oct 06 09:51:27 crc kubenswrapper[4610]: I1006 09:51:27.896438 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.083881 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-utilities\") pod \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.084198 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdxrq\" (UniqueName: \"kubernetes.io/projected/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-kube-api-access-pdxrq\") pod \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.084458 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-catalog-content\") pod \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\" (UID: \"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe\") " Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.084795 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-utilities" (OuterVolumeSpecName: "utilities") pod "d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" (UID: "d9b60ad6-5a42-40cd-88b0-d7c01a1410fe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.085427 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.096180 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-kube-api-access-pdxrq" (OuterVolumeSpecName: "kube-api-access-pdxrq") pod "d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" (UID: "d9b60ad6-5a42-40cd-88b0-d7c01a1410fe"). InnerVolumeSpecName "kube-api-access-pdxrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.103001 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" (UID: "d9b60ad6-5a42-40cd-88b0-d7c01a1410fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.187202 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdxrq\" (UniqueName: \"kubernetes.io/projected/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-kube-api-access-pdxrq\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.187239 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.733325 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfnmq" event={"ID":"d9b60ad6-5a42-40cd-88b0-d7c01a1410fe","Type":"ContainerDied","Data":"6284e7581a194322573b1a8df4e61d7d25b9b5c1e7e01bb4a11b747a6350aee5"} Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.733442 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfnmq" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.733688 4610 scope.go:117] "RemoveContainer" containerID="a89956d627bee55629498dc38a0f0e471872471194de7e6f433d1fb94f0936b8" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.758953 4610 scope.go:117] "RemoveContainer" containerID="52d358253ed6727081214690c58a23e3ab74c4c85b8f70b15f9879ad7a8f6e6d" Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.778605 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfnmq"] Oct 06 09:51:28 crc kubenswrapper[4610]: I1006 09:51:28.786694 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfnmq"] Oct 06 09:51:29 crc kubenswrapper[4610]: I1006 09:51:29.086888 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" path="/var/lib/kubelet/pods/d9b60ad6-5a42-40cd-88b0-d7c01a1410fe/volumes" Oct 06 09:51:29 crc kubenswrapper[4610]: I1006 09:51:29.118680 4610 scope.go:117] "RemoveContainer" containerID="70c72f18a0869457e7e9f95109aee52b666554f78d7de547932ce1307f4ed9ef" Oct 06 09:52:16 crc kubenswrapper[4610]: I1006 09:52:16.469372 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:52:16 crc kubenswrapper[4610]: I1006 09:52:16.469951 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:52:46 crc kubenswrapper[4610]: I1006 09:52:46.469688 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:52:46 crc kubenswrapper[4610]: I1006 09:52:46.470618 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.469420 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.469993 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.470063 4610 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.470858 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.470924 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" gracePeriod=600 Oct 06 09:53:16 crc kubenswrapper[4610]: E1006 09:53:16.664736 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.779569 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" exitCode=0 Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.779616 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"} Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.779654 4610 scope.go:117] "RemoveContainer" containerID="66901af95c9533a46c6d3bb2dbab4896780eb28dec7cd15cbc9eeacd28bd9eb7" Oct 06 09:53:16 crc kubenswrapper[4610]: I1006 09:53:16.780145 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:53:16 crc kubenswrapper[4610]: E1006 09:53:16.780519 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:53:31 crc kubenswrapper[4610]: I1006 09:53:31.071792 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:53:31 crc kubenswrapper[4610]: E1006 09:53:31.073137 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:53:46 
crc kubenswrapper[4610]: I1006 09:53:46.070829 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:53:46 crc kubenswrapper[4610]: E1006 09:53:46.071725 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:53:57 crc kubenswrapper[4610]: I1006 09:53:57.070255 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:53:57 crc kubenswrapper[4610]: E1006 09:53:57.070915 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:54:12 crc kubenswrapper[4610]: I1006 09:54:12.070892 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:54:12 crc kubenswrapper[4610]: E1006 09:54:12.071804 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:54:27 crc kubenswrapper[4610]: I1006 09:54:27.070915 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:54:27 crc kubenswrapper[4610]: E1006 09:54:27.071780 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:54:41 crc kubenswrapper[4610]: I1006 09:54:41.070646 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:54:41 crc kubenswrapper[4610]: E1006 09:54:41.071432 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:54:52 crc kubenswrapper[4610]: I1006 09:54:52.069992 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:54:52 crc 
kubenswrapper[4610]: E1006 09:54:52.072279 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:55:06 crc kubenswrapper[4610]: I1006 09:55:06.070503 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:55:06 crc kubenswrapper[4610]: E1006 09:55:06.072708 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.470231 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-224w2"] Oct 06 09:55:10 crc kubenswrapper[4610]: E1006 09:55:10.471205 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="extract-utilities" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.471222 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="extract-utilities" Oct 06 09:55:10 crc kubenswrapper[4610]: E1006 09:55:10.471237 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="extract-content" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.471245 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="extract-content" Oct 06 09:55:10 crc kubenswrapper[4610]: E1006 09:55:10.471259 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="registry-server" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.471267 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="registry-server" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.471494 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b60ad6-5a42-40cd-88b0-d7c01a1410fe" containerName="registry-server" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.473158 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.488277 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-224w2"] Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.516219 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-catalog-content\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.516555 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-utilities\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.516686 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrg5\" (UniqueName: \"kubernetes.io/projected/42cbdfe7-d4de-4e95-adcf-32bbc4550366-kube-api-access-qjrg5\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.617970 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-catalog-content\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.618315 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-utilities\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.618409 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-catalog-content\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.618538 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrg5\" (UniqueName: \"kubernetes.io/projected/42cbdfe7-d4de-4e95-adcf-32bbc4550366-kube-api-access-qjrg5\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.618710 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-utilities\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.641872 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qjrg5\" (UniqueName: \"kubernetes.io/projected/42cbdfe7-d4de-4e95-adcf-32bbc4550366-kube-api-access-qjrg5\") pod \"certified-operators-224w2\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") " pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:10 crc kubenswrapper[4610]: I1006 09:55:10.789101 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-224w2" Oct 06 09:55:11 crc kubenswrapper[4610]: I1006 09:55:11.375809 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-224w2"] Oct 06 09:55:11 crc kubenswrapper[4610]: I1006 09:55:11.902544 4610 generic.go:334] "Generic (PLEG): container finished" podID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerID="f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1" exitCode=0 Oct 06 09:55:11 crc kubenswrapper[4610]: I1006 09:55:11.902650 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerDied","Data":"f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1"} Oct 06 09:55:11 crc kubenswrapper[4610]: I1006 09:55:11.904465 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerStarted","Data":"66f2c00f08c0dab271a4be22c5a8da638ffb2fd2df5fcabb768c6eff1ae4de9d"} Oct 06 09:55:11 crc kubenswrapper[4610]: I1006 09:55:11.904750 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:55:13 crc kubenswrapper[4610]: I1006 09:55:13.920494 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerStarted","Data":"60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43"} Oct 06 09:55:14 crc kubenswrapper[4610]: I1006 09:55:14.935511 4610 generic.go:334] "Generic (PLEG): container finished" podID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerID="60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43" exitCode=0 Oct 06 09:55:14 crc kubenswrapper[4610]: I1006 09:55:14.935583 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerDied","Data":"60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43"} Oct 06 09:55:17 crc kubenswrapper[4610]: I1006 09:55:17.993114 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerStarted","Data":"2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c"} Oct 06 09:55:18 crc kubenswrapper[4610]: I1006 09:55:18.019818 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-224w2" podStartSLOduration=3.687361686 podStartE2EDuration="8.019796477s" podCreationTimestamp="2025-10-06 09:55:10 +0000 UTC" firstStartedPulling="2025-10-06 09:55:11.904508492 +0000 UTC m=+4443.619561880" lastFinishedPulling="2025-10-06 09:55:16.236943283 +0000 UTC m=+4447.951996671" observedRunningTime="2025-10-06 09:55:18.009319137 +0000 UTC m=+4449.724372535" watchObservedRunningTime="2025-10-06 
Oct 06 09:55:19 crc kubenswrapper[4610]: I1006 09:55:19.078345 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:55:19 crc kubenswrapper[4610]: E1006 09:55:19.078961 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:55:20 crc kubenswrapper[4610]: I1006 09:55:20.790146 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-224w2"
Oct 06 09:55:20 crc kubenswrapper[4610]: I1006 09:55:20.791463 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-224w2"
Oct 06 09:55:20 crc kubenswrapper[4610]: I1006 09:55:20.847605 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-224w2"
Oct 06 09:55:22 crc kubenswrapper[4610]: I1006 09:55:22.430023 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-224w2"
Oct 06 09:55:22 crc kubenswrapper[4610]: I1006 09:55:22.480346 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-224w2"]
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.042113 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-224w2" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="registry-server" containerID="cri-o://2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c" gracePeriod=2
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.599490 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-224w2"
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.696507 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjrg5\" (UniqueName: \"kubernetes.io/projected/42cbdfe7-d4de-4e95-adcf-32bbc4550366-kube-api-access-qjrg5\") pod \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") "
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.696598 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-catalog-content\") pod \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") "
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.696935 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-utilities\") pod \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\" (UID: \"42cbdfe7-d4de-4e95-adcf-32bbc4550366\") "
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.698271 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-utilities" (OuterVolumeSpecName: "utilities") pod "42cbdfe7-d4de-4e95-adcf-32bbc4550366" (UID: "42cbdfe7-d4de-4e95-adcf-32bbc4550366"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.698980 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.715359 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cbdfe7-d4de-4e95-adcf-32bbc4550366-kube-api-access-qjrg5" (OuterVolumeSpecName: "kube-api-access-qjrg5") pod "42cbdfe7-d4de-4e95-adcf-32bbc4550366" (UID: "42cbdfe7-d4de-4e95-adcf-32bbc4550366"). InnerVolumeSpecName "kube-api-access-qjrg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.748020 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42cbdfe7-d4de-4e95-adcf-32bbc4550366" (UID: "42cbdfe7-d4de-4e95-adcf-32bbc4550366"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.801248 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjrg5\" (UniqueName: \"kubernetes.io/projected/42cbdfe7-d4de-4e95-adcf-32bbc4550366-kube-api-access-qjrg5\") on node \"crc\" DevicePath \"\""
Oct 06 09:55:24 crc kubenswrapper[4610]: I1006 09:55:24.801289 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42cbdfe7-d4de-4e95-adcf-32bbc4550366-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.052907 4610 generic.go:334] "Generic (PLEG): container finished" podID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerID="2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c" exitCode=0
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.052956 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerDied","Data":"2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c"}
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.052968 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-224w2"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.052993 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-224w2" event={"ID":"42cbdfe7-d4de-4e95-adcf-32bbc4550366","Type":"ContainerDied","Data":"66f2c00f08c0dab271a4be22c5a8da638ffb2fd2df5fcabb768c6eff1ae4de9d"}
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.053017 4610 scope.go:117] "RemoveContainer" containerID="2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.086396 4610 scope.go:117] "RemoveContainer" containerID="60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.094078 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-224w2"]
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.125654 4610 scope.go:117] "RemoveContainer" containerID="f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.125895 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-224w2"]
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.178864 4610 scope.go:117] "RemoveContainer" containerID="2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c"
Oct 06 09:55:25 crc kubenswrapper[4610]: E1006 09:55:25.180463 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c\": container with ID starting with 2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c not found: ID does not exist" containerID="2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.180500 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c"} err="failed to get container status \"2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c\": rpc error: code = NotFound desc = could not find container \"2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c\": container with ID starting with 2cc7a7e465b9cc5de38dda4471ba430349e14c33d1045f261b1b01d1b45a2e2c not found: ID does not exist"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.180521 4610 scope.go:117] "RemoveContainer" containerID="60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43"
Oct 06 09:55:25 crc kubenswrapper[4610]: E1006 09:55:25.181132 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43\": container with ID starting with 60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43 not found: ID does not exist" containerID="60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.181153 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43"} err="failed to get container status \"60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43\": rpc error: code = NotFound desc = could not find container \"60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43\": container with ID starting with 60e4d8d37c6d0d8cbe39d5e562a6cb266d60ce3d3f3cb7ccbd1c27666e731e43 not found: ID does not exist"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.181166 4610 scope.go:117] "RemoveContainer" containerID="f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1"
Oct 06 09:55:25 crc kubenswrapper[4610]: E1006 09:55:25.181748 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1\": container with ID starting with f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1 not found: ID does not exist" containerID="f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1"
Oct 06 09:55:25 crc kubenswrapper[4610]: I1006 09:55:25.181767 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1"} err="failed to get container status \"f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1\": rpc error: code = NotFound desc = could not find container \"f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1\": container with ID starting with f0ea9239a8c1415b81f33ff80d2c7feb2f33d2be5b6a2a82bb90415f506aafb1 not found: ID does not exist"
Oct 06 09:55:27 crc kubenswrapper[4610]: I1006 09:55:27.081877 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" path="/var/lib/kubelet/pods/42cbdfe7-d4de-4e95-adcf-32bbc4550366/volumes"
Oct 06 09:55:30 crc kubenswrapper[4610]: I1006 09:55:30.071005 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:55:30 crc kubenswrapper[4610]: E1006 09:55:30.071508 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:55:45 crc kubenswrapper[4610]: I1006 09:55:45.071452 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:55:45 crc kubenswrapper[4610]: E1006 09:55:45.072094 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:55:56 crc kubenswrapper[4610]: I1006 09:55:56.070305 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:55:56 crc kubenswrapper[4610]: E1006 09:55:56.071034 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:56:08 crc kubenswrapper[4610]: I1006 09:56:08.070328 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:56:08 crc kubenswrapper[4610]: E1006 09:56:08.070926 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:56:23 crc kubenswrapper[4610]: I1006 09:56:23.070496 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:56:23 crc kubenswrapper[4610]: E1006 09:56:23.071562 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:56:37 crc kubenswrapper[4610]: I1006 09:56:37.071109 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:56:37 crc kubenswrapper[4610]: E1006 09:56:37.071907 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:56:48 crc kubenswrapper[4610]: I1006 09:56:48.070618 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:56:48 crc kubenswrapper[4610]: E1006 09:56:48.071890 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:56:59 crc kubenswrapper[4610]: I1006 09:56:59.081772 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:56:59 crc kubenswrapper[4610]: E1006 09:56:59.082983 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:57:12 crc kubenswrapper[4610]: I1006 09:57:12.071359 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:57:12 crc kubenswrapper[4610]: E1006 09:57:12.072438 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:57:26 crc kubenswrapper[4610]: I1006 09:57:26.070461 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:57:26 crc kubenswrapper[4610]: E1006 09:57:26.071228 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:57:39 crc kubenswrapper[4610]: I1006 09:57:39.080194 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
Oct 06 09:57:39 crc kubenswrapper[4610]: E1006 09:57:39.080963 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 09:57:54 crc kubenswrapper[4610]: I1006 09:57:54.070550 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21"
containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:57:54 crc kubenswrapper[4610]: E1006 09:57:54.071370 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:58:09 crc kubenswrapper[4610]: I1006 09:58:09.078475 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:58:09 crc kubenswrapper[4610]: E1006 09:58:09.080422 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 09:58:21 crc kubenswrapper[4610]: I1006 09:58:21.070915 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 09:58:21 crc kubenswrapper[4610]: I1006 09:58:21.886230 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"649a9565921e23e23c12e053e6001d7964396f792a9fdf0e7cd60160275ccef6"} Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.184923 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg"] Oct 06 10:00:00 crc kubenswrapper[4610]: E1006 10:00:00.199701 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="registry-server" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.199726 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="registry-server" Oct 06 10:00:00 crc kubenswrapper[4610]: E1006 10:00:00.199740 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="extract-utilities" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.199749 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="extract-utilities" Oct 06 10:00:00 crc kubenswrapper[4610]: E1006 10:00:00.199772 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="extract-content" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.199780 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="extract-content" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.200083 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cbdfe7-d4de-4e95-adcf-32bbc4550366" containerName="registry-server" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.200966 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg"] Oct 06 
10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.201104 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.205738 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.205794 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.321017 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-config-volume\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.321146 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-secret-volume\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.321239 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snqx7\" (UniqueName: \"kubernetes.io/projected/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-kube-api-access-snqx7\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.427137 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snqx7\" (UniqueName: \"kubernetes.io/projected/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-kube-api-access-snqx7\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.427229 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-config-volume\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.427292 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-secret-volume\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.428233 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-config-volume\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.442548 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-secret-volume\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.460705 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snqx7\" (UniqueName: \"kubernetes.io/projected/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-kube-api-access-snqx7\") pod \"collect-profiles-29329080-kc9dg\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.527088 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:00 crc kubenswrapper[4610]: I1006 10:00:00.994348 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg"] Oct 06 10:00:01 crc kubenswrapper[4610]: W1006 10:00:01.027680 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4c4056_b5d2_4ab5_a019_e1d4e3d01e96.slice/crio-4c222f11964cb9fabba3d4950f5ac5e50bba5725c19bb905712b996361ba4054 WatchSource:0}: Error finding container 4c222f11964cb9fabba3d4950f5ac5e50bba5725c19bb905712b996361ba4054: Status 404 returned error can't find the container with id 4c222f11964cb9fabba3d4950f5ac5e50bba5725c19bb905712b996361ba4054 Oct 06 10:00:01 crc kubenswrapper[4610]: I1006 10:00:01.997996 4610 generic.go:334] "Generic (PLEG): container finished" podID="7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" containerID="7984b688c46f210109744fe8ceddaee3a1ecfb4ab7cc3ad31f8b8d465b60cee7" exitCode=0 Oct 06 10:00:01 crc kubenswrapper[4610]: I1006 10:00:01.998117 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" event={"ID":"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96","Type":"ContainerDied","Data":"7984b688c46f210109744fe8ceddaee3a1ecfb4ab7cc3ad31f8b8d465b60cee7"} Oct 06 10:00:01 crc kubenswrapper[4610]: I1006 10:00:01.998334 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" event={"ID":"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96","Type":"ContainerStarted","Data":"4c222f11964cb9fabba3d4950f5ac5e50bba5725c19bb905712b996361ba4054"} Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.392122 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.504860 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-secret-volume\") pod \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.505249 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-config-volume\") pod \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.505276 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snqx7\" (UniqueName: \"kubernetes.io/projected/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-kube-api-access-snqx7\") pod \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\" (UID: \"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96\") " Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.506720 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" (UID: "7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.512856 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-kube-api-access-snqx7" (OuterVolumeSpecName: "kube-api-access-snqx7") pod "7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" (UID: "7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96"). InnerVolumeSpecName "kube-api-access-snqx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.521676 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" (UID: "7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.607079 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.607118 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:03 crc kubenswrapper[4610]: I1006 10:00:03.607131 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snqx7\" (UniqueName: \"kubernetes.io/projected/7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96-kube-api-access-snqx7\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:04 crc kubenswrapper[4610]: I1006 10:00:04.016124 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" event={"ID":"7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96","Type":"ContainerDied","Data":"4c222f11964cb9fabba3d4950f5ac5e50bba5725c19bb905712b996361ba4054"} Oct 06 10:00:04 crc kubenswrapper[4610]: I1006 10:00:04.016179 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c222f11964cb9fabba3d4950f5ac5e50bba5725c19bb905712b996361ba4054" Oct 06 10:00:04 crc kubenswrapper[4610]: I1006 10:00:04.016255 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329080-kc9dg" Oct 06 10:00:04 crc kubenswrapper[4610]: I1006 10:00:04.506023 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6"] Oct 06 10:00:04 crc kubenswrapper[4610]: I1006 10:00:04.515937 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329035-l7sc6"] Oct 06 10:00:05 crc kubenswrapper[4610]: I1006 10:00:05.093014 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b83115b-6506-4f12-8b07-a850a423dc9b" path="/var/lib/kubelet/pods/6b83115b-6506-4f12-8b07-a850a423dc9b/volumes" Oct 06 10:00:15 crc kubenswrapper[4610]: I1006 10:00:15.939402 4610 scope.go:117] "RemoveContainer" containerID="4c2990495f0d3f8012ae321831baa99006b91b0817fbcb21760344ae7c9cdfbe" Oct 06 10:00:46 crc kubenswrapper[4610]: I1006 10:00:46.468826 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 10:00:46 crc kubenswrapper[4610]: I1006 10:00:46.469304 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.547877 4610 generic.go:334] "Generic (PLEG): container finished" podID="6effef24-402a-46e6-a15a-02815ef810ae" containerID="3857536c327114a3ff5db3acd0cbe8b95622569cc2358a5018afbc272436d061" exitCode=0 Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.547923 4610 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6effef24-402a-46e6-a15a-02815ef810ae","Type":"ContainerDied","Data":"3857536c327114a3ff5db3acd0cbe8b95622569cc2358a5018afbc272436d061"} Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.760192 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jtxck"] Oct 06 10:00:54 crc kubenswrapper[4610]: E1006 10:00:54.760746 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" containerName="collect-profiles" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.760765 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" containerName="collect-profiles" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.760964 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4c4056-b5d2-4ab5-a019-e1d4e3d01e96" containerName="collect-profiles" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.765595 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.783007 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtxck"] Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.865025 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-utilities\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.865094 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-catalog-content\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.865137 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p5f\" (UniqueName: \"kubernetes.io/projected/f20e5484-d50d-4f65-aea6-720b3706e2ff-kube-api-access-78p5f\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.966867 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-utilities\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.966927 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-catalog-content\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.967019 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-78p5f\" (UniqueName: \"kubernetes.io/projected/f20e5484-d50d-4f65-aea6-720b3706e2ff-kube-api-access-78p5f\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.967468 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-utilities\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:54 crc kubenswrapper[4610]: I1006 10:00:54.967483 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-catalog-content\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:55 crc kubenswrapper[4610]: I1006 10:00:55.290102 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p5f\" (UniqueName: \"kubernetes.io/projected/f20e5484-d50d-4f65-aea6-720b3706e2ff-kube-api-access-78p5f\") pod \"community-operators-jtxck\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:55 crc kubenswrapper[4610]: I1006 10:00:55.382487 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:00:55 crc kubenswrapper[4610]: I1006 10:00:55.881876 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtxck"] Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.038775 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.186893 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ssh-key\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187219 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-config-data\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187320 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187393 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187465 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config-secret\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187494 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-workdir\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187570 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfx6s\" (UniqueName: \"kubernetes.io/projected/6effef24-402a-46e6-a15a-02815ef810ae-kube-api-access-jfx6s\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187608 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ca-certs\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.187636 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-temporary\") pod \"6effef24-402a-46e6-a15a-02815ef810ae\" (UID: \"6effef24-402a-46e6-a15a-02815ef810ae\") " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.188698 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-config-data" (OuterVolumeSpecName: "config-data") pod 
"6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.189129 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.192473 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.195544 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.201214 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6effef24-402a-46e6-a15a-02815ef810ae-kube-api-access-jfx6s" (OuterVolumeSpecName: "kube-api-access-jfx6s") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "kube-api-access-jfx6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.223824 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.225391 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.226566 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.245015 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6effef24-402a-46e6-a15a-02815ef810ae" (UID: "6effef24-402a-46e6-a15a-02815ef810ae"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290322 4610 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290388 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290404 4610 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290417 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfx6s\" (UniqueName: \"kubernetes.io/projected/6effef24-402a-46e6-a15a-02815ef810ae-kube-api-access-jfx6s\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290430 4610 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290443 4610 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6effef24-402a-46e6-a15a-02815ef810ae-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290455 4610 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6effef24-402a-46e6-a15a-02815ef810ae-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290470 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.290482 4610 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6effef24-402a-46e6-a15a-02815ef810ae-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.325398 4610 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.392186 4610 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.582859 4610 generic.go:334] "Generic (PLEG): container 
finished" podID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerID="0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078" exitCode=0 Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.582951 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerDied","Data":"0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078"} Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.582995 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerStarted","Data":"ba75a0df952dfbc4a6af04ab5bd22638697e1de8148f34dd5dfdfa2c0ea0267e"} Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.585210 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6effef24-402a-46e6-a15a-02815ef810ae","Type":"ContainerDied","Data":"d53b6b0bed7d42016506090d754a295c0566054219284ea11272542ab99fb26e"} Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.585252 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53b6b0bed7d42016506090d754a295c0566054219284ea11272542ab99fb26e" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.585299 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 10:00:56 crc kubenswrapper[4610]: I1006 10:00:56.585344 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 10:00:57 crc kubenswrapper[4610]: I1006 10:00:57.599068 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerStarted","Data":"b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0"} Oct 06 10:00:58 crc kubenswrapper[4610]: I1006 10:00:58.609851 4610 generic.go:334] "Generic (PLEG): container finished" podID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerID="b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0" exitCode=0 Oct 06 10:00:58 crc kubenswrapper[4610]: I1006 10:00:58.609948 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerDied","Data":"b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0"} Oct 06 10:00:59 crc kubenswrapper[4610]: I1006 10:00:59.621411 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerStarted","Data":"6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800"} Oct 06 10:00:59 crc kubenswrapper[4610]: I1006 10:00:59.638752 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jtxck" podStartSLOduration=3.151479715 podStartE2EDuration="5.638740475s" podCreationTimestamp="2025-10-06 10:00:54 +0000 UTC" firstStartedPulling="2025-10-06 10:00:56.584916347 +0000 UTC m=+4788.299969765" lastFinishedPulling="2025-10-06 10:00:59.072177137 +0000 UTC m=+4790.787230525" observedRunningTime="2025-10-06 10:00:59.637919083 +0000 UTC m=+4791.352972471" watchObservedRunningTime="2025-10-06 10:00:59.638740475 +0000 UTC m=+4791.353793863" Oct 06 10:01:00 crc 
kubenswrapper[4610]: I1006 10:01:00.177556 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29329081-rlm67"] Oct 06 10:01:00 crc kubenswrapper[4610]: E1006 10:01:00.177944 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6effef24-402a-46e6-a15a-02815ef810ae" containerName="tempest-tests-tempest-tests-runner" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.177963 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6effef24-402a-46e6-a15a-02815ef810ae" containerName="tempest-tests-tempest-tests-runner" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.178266 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6effef24-402a-46e6-a15a-02815ef810ae" containerName="tempest-tests-tempest-tests-runner" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.179495 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.208911 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329081-rlm67"] Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.273053 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-combined-ca-bundle\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.273165 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-fernet-keys\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.273202 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-config-data\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.273265 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwmp\" (UniqueName: \"kubernetes.io/projected/06b1f441-39d4-4c07-8696-045031364dd2-kube-api-access-gfwmp\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.374620 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwmp\" (UniqueName: \"kubernetes.io/projected/06b1f441-39d4-4c07-8696-045031364dd2-kube-api-access-gfwmp\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.374678 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-combined-ca-bundle\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " 
pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.374766 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-fernet-keys\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.374792 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-config-data\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.383784 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-fernet-keys\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.384462 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-config-data\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.394861 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-combined-ca-bundle\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.397874 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwmp\" (UniqueName: \"kubernetes.io/projected/06b1f441-39d4-4c07-8696-045031364dd2-kube-api-access-gfwmp\") pod \"keystone-cron-29329081-rlm67\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:00 crc kubenswrapper[4610]: I1006 10:01:00.539294 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:01 crc kubenswrapper[4610]: I1006 10:01:01.115633 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329081-rlm67"] Oct 06 10:01:01 crc kubenswrapper[4610]: I1006 10:01:01.643441 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329081-rlm67" event={"ID":"06b1f441-39d4-4c07-8696-045031364dd2","Type":"ContainerStarted","Data":"658daa4670d391368c12322bd364ae07773112ad28af3412bae14ef477f1ca2a"} Oct 06 10:01:01 crc kubenswrapper[4610]: I1006 10:01:01.643703 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329081-rlm67" event={"ID":"06b1f441-39d4-4c07-8696-045031364dd2","Type":"ContainerStarted","Data":"a9ab2a4400372d2af2cf785adf08a1cd35e2c2486637ea2d7d28b519de0ccb1b"} Oct 06 10:01:01 crc kubenswrapper[4610]: I1006 10:01:01.670533 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29329081-rlm67" podStartSLOduration=1.670516613 podStartE2EDuration="1.670516613s" podCreationTimestamp="2025-10-06 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 10:01:01.664434543 +0000 UTC m=+4793.379487931" watchObservedRunningTime="2025-10-06 10:01:01.670516613 +0000 UTC m=+4793.385570001" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.384138 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.386250 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.392225 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.394262 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.396279 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bpqlc" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.407989 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.507546 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.591308 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.591663 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqww6\" (UniqueName: \"kubernetes.io/projected/9a73cfa7-ef0b-4dda-9ca4-de80de751a61-kube-api-access-vqww6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.688286 4610 generic.go:334] "Generic (PLEG): container finished" podID="06b1f441-39d4-4c07-8696-045031364dd2" containerID="658daa4670d391368c12322bd364ae07773112ad28af3412bae14ef477f1ca2a" exitCode=0 Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.688585 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329081-rlm67" event={"ID":"06b1f441-39d4-4c07-8696-045031364dd2","Type":"ContainerDied","Data":"658daa4670d391368c12322bd364ae07773112ad28af3412bae14ef477f1ca2a"} Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.693680 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.693784 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqww6\" (UniqueName: \"kubernetes.io/projected/9a73cfa7-ef0b-4dda-9ca4-de80de751a61-kube-api-access-vqww6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.695125 4610 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:05 crc kubenswrapper[4610]: I1006 10:01:05.989384 4610 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-vqww6\" (UniqueName: \"kubernetes.io/projected/9a73cfa7-ef0b-4dda-9ca4-de80de751a61-kube-api-access-vqww6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:06 crc kubenswrapper[4610]: I1006 10:01:06.137674 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9a73cfa7-ef0b-4dda-9ca4-de80de751a61\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:06 crc kubenswrapper[4610]: I1006 10:01:06.144214 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:01:06 crc kubenswrapper[4610]: I1006 10:01:06.199530 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtxck"] Oct 06 10:01:06 crc kubenswrapper[4610]: I1006 10:01:06.384278 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 10:01:06 crc kubenswrapper[4610]: I1006 10:01:06.884749 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 10:01:06 crc kubenswrapper[4610]: W1006 10:01:06.885939 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a73cfa7_ef0b_4dda_9ca4_de80de751a61.slice/crio-9e41b3839e76720fe8016b5971e1b2723fe7884f5523057658ece3d6d0a4e3d1 WatchSource:0}: Error finding container 9e41b3839e76720fe8016b5971e1b2723fe7884f5523057658ece3d6d0a4e3d1: Status 404 returned error can't find the container with id 9e41b3839e76720fe8016b5971e1b2723fe7884f5523057658ece3d6d0a4e3d1 Oct 06 10:01:06 crc kubenswrapper[4610]: I1006 10:01:06.955482 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.127230 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-config-data\") pod \"06b1f441-39d4-4c07-8696-045031364dd2\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.127366 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-fernet-keys\") pod \"06b1f441-39d4-4c07-8696-045031364dd2\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.127396 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfwmp\" (UniqueName: \"kubernetes.io/projected/06b1f441-39d4-4c07-8696-045031364dd2-kube-api-access-gfwmp\") pod \"06b1f441-39d4-4c07-8696-045031364dd2\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.127419 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-combined-ca-bundle\") pod \"06b1f441-39d4-4c07-8696-045031364dd2\" (UID: \"06b1f441-39d4-4c07-8696-045031364dd2\") " Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.691361 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "06b1f441-39d4-4c07-8696-045031364dd2" (UID: "06b1f441-39d4-4c07-8696-045031364dd2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.713240 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b1f441-39d4-4c07-8696-045031364dd2-kube-api-access-gfwmp" (OuterVolumeSpecName: "kube-api-access-gfwmp") pod "06b1f441-39d4-4c07-8696-045031364dd2" (UID: "06b1f441-39d4-4c07-8696-045031364dd2"). InnerVolumeSpecName "kube-api-access-gfwmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.739222 4610 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.739252 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfwmp\" (UniqueName: \"kubernetes.io/projected/06b1f441-39d4-4c07-8696-045031364dd2-kube-api-access-gfwmp\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.751396 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9a73cfa7-ef0b-4dda-9ca4-de80de751a61","Type":"ContainerStarted","Data":"9e41b3839e76720fe8016b5971e1b2723fe7884f5523057658ece3d6d0a4e3d1"} Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.782169 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329081-rlm67" event={"ID":"06b1f441-39d4-4c07-8696-045031364dd2","Type":"ContainerDied","Data":"a9ab2a4400372d2af2cf785adf08a1cd35e2c2486637ea2d7d28b519de0ccb1b"} Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.782211 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ab2a4400372d2af2cf785adf08a1cd35e2c2486637ea2d7d28b519de0ccb1b" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.782274 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329081-rlm67" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.782279 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jtxck" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="registry-server" containerID="cri-o://6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800" gracePeriod=2 Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.811200 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-config-data" (OuterVolumeSpecName: "config-data") pod "06b1f441-39d4-4c07-8696-045031364dd2" (UID: "06b1f441-39d4-4c07-8696-045031364dd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.845490 4610 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.877170 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b1f441-39d4-4c07-8696-045031364dd2" (UID: "06b1f441-39d4-4c07-8696-045031364dd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 10:01:07 crc kubenswrapper[4610]: I1006 10:01:07.947681 4610 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b1f441-39d4-4c07-8696-045031364dd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.386682 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.457671 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-catalog-content\") pod \"f20e5484-d50d-4f65-aea6-720b3706e2ff\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.458150 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-utilities\") pod \"f20e5484-d50d-4f65-aea6-720b3706e2ff\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.458241 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78p5f\" (UniqueName: \"kubernetes.io/projected/f20e5484-d50d-4f65-aea6-720b3706e2ff-kube-api-access-78p5f\") pod \"f20e5484-d50d-4f65-aea6-720b3706e2ff\" (UID: \"f20e5484-d50d-4f65-aea6-720b3706e2ff\") " Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.460456 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-utilities" (OuterVolumeSpecName: "utilities") pod "f20e5484-d50d-4f65-aea6-720b3706e2ff" (UID: "f20e5484-d50d-4f65-aea6-720b3706e2ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.464426 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20e5484-d50d-4f65-aea6-720b3706e2ff-kube-api-access-78p5f" (OuterVolumeSpecName: "kube-api-access-78p5f") pod "f20e5484-d50d-4f65-aea6-720b3706e2ff" (UID: "f20e5484-d50d-4f65-aea6-720b3706e2ff"). InnerVolumeSpecName "kube-api-access-78p5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.511562 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f20e5484-d50d-4f65-aea6-720b3706e2ff" (UID: "f20e5484-d50d-4f65-aea6-720b3706e2ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.561018 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.561086 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78p5f\" (UniqueName: \"kubernetes.io/projected/f20e5484-d50d-4f65-aea6-720b3706e2ff-kube-api-access-78p5f\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.561099 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20e5484-d50d-4f65-aea6-720b3706e2ff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.798481 4610 generic.go:334] "Generic (PLEG): container finished" podID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerID="6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800" exitCode=0 Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.798600 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtxck" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.798603 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerDied","Data":"6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800"} Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.798766 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtxck" event={"ID":"f20e5484-d50d-4f65-aea6-720b3706e2ff","Type":"ContainerDied","Data":"ba75a0df952dfbc4a6af04ab5bd22638697e1de8148f34dd5dfdfa2c0ea0267e"} Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.798797 4610 scope.go:117] "RemoveContainer" containerID="6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.801827 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9a73cfa7-ef0b-4dda-9ca4-de80de751a61","Type":"ContainerStarted","Data":"2f746a13038f162f116d479d8a1b5ae6e0383f9e0639edabf42f20ece70d24ef"} Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.833528 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.388694528 podStartE2EDuration="3.833497213s" podCreationTimestamp="2025-10-06 10:01:05 +0000 UTC" firstStartedPulling="2025-10-06 10:01:06.899118101 +0000 UTC m=+4798.614171489" lastFinishedPulling="2025-10-06 10:01:08.343920786 +0000 UTC m=+4800.058974174" observedRunningTime="2025-10-06 10:01:08.821728214 +0000 UTC m=+4800.536781632" watchObservedRunningTime="2025-10-06 10:01:08.833497213 +0000 UTC m=+4800.548550641" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.842986 4610 scope.go:117] "RemoveContainer" containerID="b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.862837 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtxck"] Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.872499 4610 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jtxck"] Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.879956 4610 scope.go:117] "RemoveContainer" containerID="0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.914167 4610 scope.go:117] "RemoveContainer" containerID="6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800" Oct 06 10:01:08 crc kubenswrapper[4610]: E1006 10:01:08.914832 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800\": container with ID starting with 6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800 not found: ID does not exist" containerID="6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.914961 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800"} err="failed to get container status \"6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800\": rpc error: code = NotFound desc = could not find container \"6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800\": container with ID starting with 6866c734b1172edacaed5c054301eb311249aaa62e90f379403f4039d5228800 not found: ID does not exist" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.915086 4610 scope.go:117] "RemoveContainer" containerID="b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0" Oct 06 10:01:08 crc kubenswrapper[4610]: E1006 10:01:08.915849 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0\": container with ID starting with b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0 not found: ID does not exist" containerID="b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.916000 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0"} err="failed to get container status \"b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0\": rpc error: code = NotFound desc = could not find container \"b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0\": container with ID starting with b444fcdf1e4d11a88edf4e68fdb157c3898f5637b1e25da321b595c8869d5bf0 not found: ID does not exist" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.916161 4610 scope.go:117] "RemoveContainer" containerID="0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078" Oct 06 10:01:08 crc kubenswrapper[4610]: E1006 10:01:08.916669 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078\": container with ID starting with 0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078 not found: ID does not exist" containerID="0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078" Oct 06 10:01:08 crc kubenswrapper[4610]: I1006 10:01:08.916825 4610 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078"} err="failed to get container status \"0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078\": rpc error: code = NotFound desc = could not find container \"0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078\": container with ID starting with 0593ea19571b044d40354d8a67379dc7f8fa72b68b28dec82bbe9e26a6be9078 not found: ID does not exist" Oct 06 10:01:09 crc kubenswrapper[4610]: I1006 10:01:09.117348 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" path="/var/lib/kubelet/pods/f20e5484-d50d-4f65-aea6-720b3706e2ff/volumes" Oct 06 10:01:16 crc kubenswrapper[4610]: I1006 10:01:16.469814 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 10:01:16 crc kubenswrapper[4610]: I1006 10:01:16.470456 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.776972 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcrz5/must-gather-6jtdk"] Oct 06 10:01:24 crc kubenswrapper[4610]: E1006 10:01:24.788264 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b1f441-39d4-4c07-8696-045031364dd2" containerName="keystone-cron" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.788294 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b1f441-39d4-4c07-8696-045031364dd2" containerName="keystone-cron" Oct 06 10:01:24 crc kubenswrapper[4610]: E1006 10:01:24.788309 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="registry-server" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.788316 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="registry-server" Oct 06 10:01:24 crc kubenswrapper[4610]: E1006 10:01:24.788348 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="extract-utilities" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.788358 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="extract-utilities" Oct 06 10:01:24 crc kubenswrapper[4610]: E1006 10:01:24.788372 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="extract-content" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.788377 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="extract-content" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.788557 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20e5484-d50d-4f65-aea6-720b3706e2ff" containerName="registry-server" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.788571 4610 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="06b1f441-39d4-4c07-8696-045031364dd2" containerName="keystone-cron" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.789558 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.799406 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wcrz5"/"kube-root-ca.crt" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.799599 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wcrz5"/"openshift-service-ca.crt" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.803340 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wcrz5"/"default-dockercfg-26b9c" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.805868 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcrz5/must-gather-6jtdk"] Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.868122 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhdf\" (UniqueName: \"kubernetes.io/projected/b36935b4-d00d-4548-a6f6-4b838fa76ec1-kube-api-access-blhdf\") pod \"must-gather-6jtdk\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.868173 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b36935b4-d00d-4548-a6f6-4b838fa76ec1-must-gather-output\") pod \"must-gather-6jtdk\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.971756 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhdf\" (UniqueName: \"kubernetes.io/projected/b36935b4-d00d-4548-a6f6-4b838fa76ec1-kube-api-access-blhdf\") pod \"must-gather-6jtdk\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.971829 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b36935b4-d00d-4548-a6f6-4b838fa76ec1-must-gather-output\") pod \"must-gather-6jtdk\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:24 crc kubenswrapper[4610]: I1006 10:01:24.972297 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b36935b4-d00d-4548-a6f6-4b838fa76ec1-must-gather-output\") pod \"must-gather-6jtdk\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:25 crc kubenswrapper[4610]: I1006 10:01:24.990682 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhdf\" (UniqueName: \"kubernetes.io/projected/b36935b4-d00d-4548-a6f6-4b838fa76ec1-kube-api-access-blhdf\") pod \"must-gather-6jtdk\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:25 crc kubenswrapper[4610]: I1006 10:01:25.143493 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:01:25 crc kubenswrapper[4610]: I1006 10:01:25.650095 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcrz5/must-gather-6jtdk"] Oct 06 10:01:25 crc kubenswrapper[4610]: I1006 10:01:25.986394 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" event={"ID":"b36935b4-d00d-4548-a6f6-4b838fa76ec1","Type":"ContainerStarted","Data":"7d3465a9dda868cbc7645253bb76db21c87b4a1b27e1db6b813df829e036601f"} Oct 06 10:01:31 crc kubenswrapper[4610]: I1006 10:01:31.043957 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" event={"ID":"b36935b4-d00d-4548-a6f6-4b838fa76ec1","Type":"ContainerStarted","Data":"5bb09b43df46ceb743a66b8ac6e677dc18b88543b00e41f3e2ddaa1d2a6fce1e"} Oct 06 10:01:31 crc kubenswrapper[4610]: I1006 10:01:31.044590 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" event={"ID":"b36935b4-d00d-4548-a6f6-4b838fa76ec1","Type":"ContainerStarted","Data":"cf2095d28d96a94ae582c01e05818d4dfb4f119dbc2940f6442114b364ab6a4c"} Oct 06 10:01:31 crc kubenswrapper[4610]: I1006 10:01:31.064422 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" podStartSLOduration=2.444891051 podStartE2EDuration="7.064376135s" podCreationTimestamp="2025-10-06 10:01:24 +0000 UTC" firstStartedPulling="2025-10-06 10:01:25.654944522 +0000 UTC m=+4817.369997940" lastFinishedPulling="2025-10-06 10:01:30.274429636 +0000 UTC m=+4821.989483024" observedRunningTime="2025-10-06 10:01:31.061983972 +0000 UTC m=+4822.777037370" watchObservedRunningTime="2025-10-06 10:01:31.064376135 +0000 UTC m=+4822.779429533" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.143234 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-7qdqp"] Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.145751 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.176897 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dc7eda8-2bf1-4115-a854-864f4510aee6-host\") pod \"crc-debug-7qdqp\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.177066 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf7cc\" (UniqueName: \"kubernetes.io/projected/1dc7eda8-2bf1-4115-a854-864f4510aee6-kube-api-access-pf7cc\") pod \"crc-debug-7qdqp\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.278629 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dc7eda8-2bf1-4115-a854-864f4510aee6-host\") pod \"crc-debug-7qdqp\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.279276 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf7cc\" (UniqueName: \"kubernetes.io/projected/1dc7eda8-2bf1-4115-a854-864f4510aee6-kube-api-access-pf7cc\") pod \"crc-debug-7qdqp\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.279140 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dc7eda8-2bf1-4115-a854-864f4510aee6-host\") pod \"crc-debug-7qdqp\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.301696 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf7cc\" (UniqueName: \"kubernetes.io/projected/1dc7eda8-2bf1-4115-a854-864f4510aee6-kube-api-access-pf7cc\") pod \"crc-debug-7qdqp\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:35 crc kubenswrapper[4610]: I1006 10:01:35.466343 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:01:36 crc kubenswrapper[4610]: I1006 10:01:36.093438 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" event={"ID":"1dc7eda8-2bf1-4115-a854-864f4510aee6","Type":"ContainerStarted","Data":"9022555ab6937e520a5d86b29da19b0a9648632f61c49b1f5987201495cacfaa"} Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.211725 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" event={"ID":"1dc7eda8-2bf1-4115-a854-864f4510aee6","Type":"ContainerStarted","Data":"e52e77a19dd9badac8c549d66f315b2f35d79c6725d0cddb882b748283693a18"} Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.226923 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" podStartSLOduration=1.7715504119999999 podStartE2EDuration="11.226907269s" podCreationTimestamp="2025-10-06 10:01:35 +0000 UTC" firstStartedPulling="2025-10-06 10:01:35.500487147 +0000 UTC m=+4827.215540535" lastFinishedPulling="2025-10-06 10:01:44.955844004 +0000 UTC m=+4836.670897392" observedRunningTime="2025-10-06 10:01:46.22314464 +0000 UTC m=+4837.938198028" watchObservedRunningTime="2025-10-06 10:01:46.226907269 +0000 UTC m=+4837.941960657" Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.469484 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.469546 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.469596 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.470384 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"649a9565921e23e23c12e053e6001d7964396f792a9fdf0e7cd60160275ccef6"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 10:01:46 crc kubenswrapper[4610]: I1006 10:01:46.470443 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://649a9565921e23e23c12e053e6001d7964396f792a9fdf0e7cd60160275ccef6" gracePeriod=600 Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.223388 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="649a9565921e23e23c12e053e6001d7964396f792a9fdf0e7cd60160275ccef6" exitCode=0 Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.224584 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"649a9565921e23e23c12e053e6001d7964396f792a9fdf0e7cd60160275ccef6"} Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.224615 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af"} Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.224633 4610 scope.go:117] "RemoveContainer" containerID="53960739d0f6f824be8e16a2c544476245396cd65a86bf61e807cca1bfee9e21" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.589406 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7gt5l"] Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.591678 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.600745 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gt5l"] Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.606369 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-catalog-content\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.606504 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-utilities\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.606603 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzf4n\" (UniqueName: \"kubernetes.io/projected/6c5216c1-a820-47c5-b334-1636537737fe-kube-api-access-rzf4n\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.709153 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-catalog-content\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.708567 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-catalog-content\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.709534 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-utilities\") pod 
\"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.709851 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzf4n\" (UniqueName: \"kubernetes.io/projected/6c5216c1-a820-47c5-b334-1636537737fe-kube-api-access-rzf4n\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.710340 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-utilities\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.740534 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzf4n\" (UniqueName: \"kubernetes.io/projected/6c5216c1-a820-47c5-b334-1636537737fe-kube-api-access-rzf4n\") pod \"redhat-marketplace-7gt5l\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:47 crc kubenswrapper[4610]: I1006 10:01:47.936389 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:48 crc kubenswrapper[4610]: I1006 10:01:48.446269 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gt5l"] Oct 06 10:01:48 crc kubenswrapper[4610]: W1006 10:01:48.450836 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c5216c1_a820_47c5_b334_1636537737fe.slice/crio-94a480d862c88fcb94af9277cbbdb35e962f607c5fd014983cd0df24da78b75a WatchSource:0}: Error finding container 94a480d862c88fcb94af9277cbbdb35e962f607c5fd014983cd0df24da78b75a: Status 404 returned error can't find the container with id 94a480d862c88fcb94af9277cbbdb35e962f607c5fd014983cd0df24da78b75a Oct 06 10:01:49 crc kubenswrapper[4610]: I1006 10:01:49.244685 4610 generic.go:334] "Generic (PLEG): container finished" podID="6c5216c1-a820-47c5-b334-1636537737fe" containerID="cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30" exitCode=0 Oct 06 10:01:49 crc kubenswrapper[4610]: I1006 10:01:49.244758 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerDied","Data":"cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30"} Oct 06 10:01:49 crc kubenswrapper[4610]: I1006 10:01:49.245333 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerStarted","Data":"94a480d862c88fcb94af9277cbbdb35e962f607c5fd014983cd0df24da78b75a"} Oct 06 10:01:52 crc kubenswrapper[4610]: I1006 10:01:52.277097 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerStarted","Data":"527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292"} Oct 06 10:01:53 crc kubenswrapper[4610]: I1006 10:01:53.294077 4610 generic.go:334] "Generic (PLEG): container 
finished" podID="6c5216c1-a820-47c5-b334-1636537737fe" containerID="527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292" exitCode=0 Oct 06 10:01:53 crc kubenswrapper[4610]: I1006 10:01:53.294181 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerDied","Data":"527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292"} Oct 06 10:01:54 crc kubenswrapper[4610]: I1006 10:01:54.305870 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerStarted","Data":"3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42"} Oct 06 10:01:54 crc kubenswrapper[4610]: I1006 10:01:54.323153 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7gt5l" podStartSLOduration=2.870224507 podStartE2EDuration="7.32313593s" podCreationTimestamp="2025-10-06 10:01:47 +0000 UTC" firstStartedPulling="2025-10-06 10:01:49.246687674 +0000 UTC m=+4840.961741062" lastFinishedPulling="2025-10-06 10:01:53.699599097 +0000 UTC m=+4845.414652485" observedRunningTime="2025-10-06 10:01:54.319864164 +0000 UTC m=+4846.034917552" watchObservedRunningTime="2025-10-06 10:01:54.32313593 +0000 UTC m=+4846.038189318" Oct 06 10:01:57 crc kubenswrapper[4610]: I1006 10:01:57.937174 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:57 crc kubenswrapper[4610]: I1006 10:01:57.937719 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:57 crc kubenswrapper[4610]: I1006 10:01:57.992929 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:58 crc kubenswrapper[4610]: I1006 10:01:58.407487 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:01:58 crc kubenswrapper[4610]: I1006 10:01:58.472861 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gt5l"] Oct 06 10:02:00 crc kubenswrapper[4610]: I1006 10:02:00.372854 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7gt5l" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="registry-server" containerID="cri-o://3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42" gracePeriod=2 Oct 06 10:02:00 crc kubenswrapper[4610]: I1006 10:02:00.863807 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.050262 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzf4n\" (UniqueName: \"kubernetes.io/projected/6c5216c1-a820-47c5-b334-1636537737fe-kube-api-access-rzf4n\") pod \"6c5216c1-a820-47c5-b334-1636537737fe\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.050375 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-catalog-content\") pod \"6c5216c1-a820-47c5-b334-1636537737fe\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.050423 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-utilities\") pod \"6c5216c1-a820-47c5-b334-1636537737fe\" (UID: \"6c5216c1-a820-47c5-b334-1636537737fe\") " Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.051180 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-utilities" (OuterVolumeSpecName: "utilities") pod "6c5216c1-a820-47c5-b334-1636537737fe" (UID: "6c5216c1-a820-47c5-b334-1636537737fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.070505 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5216c1-a820-47c5-b334-1636537737fe-kube-api-access-rzf4n" (OuterVolumeSpecName: "kube-api-access-rzf4n") pod "6c5216c1-a820-47c5-b334-1636537737fe" (UID: "6c5216c1-a820-47c5-b334-1636537737fe"). InnerVolumeSpecName "kube-api-access-rzf4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.080253 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c5216c1-a820-47c5-b334-1636537737fe" (UID: "6c5216c1-a820-47c5-b334-1636537737fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.153433 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.153503 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c5216c1-a820-47c5-b334-1636537737fe-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.153520 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzf4n\" (UniqueName: \"kubernetes.io/projected/6c5216c1-a820-47c5-b334-1636537737fe-kube-api-access-rzf4n\") on node \"crc\" DevicePath \"\"" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.384486 4610 generic.go:334] "Generic (PLEG): container finished" podID="6c5216c1-a820-47c5-b334-1636537737fe" containerID="3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42" exitCode=0 Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.384707 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerDied","Data":"3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42"} Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.384794 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gt5l" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.384807 4610 scope.go:117] "RemoveContainer" containerID="3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.384792 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gt5l" event={"ID":"6c5216c1-a820-47c5-b334-1636537737fe","Type":"ContainerDied","Data":"94a480d862c88fcb94af9277cbbdb35e962f607c5fd014983cd0df24da78b75a"} Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.407699 4610 scope.go:117] "RemoveContainer" containerID="527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.422400 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gt5l"] Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.439897 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gt5l"] Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.470675 4610 scope.go:117] "RemoveContainer" containerID="cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.525564 4610 scope.go:117] "RemoveContainer" containerID="3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42" Oct 06 10:02:01 crc kubenswrapper[4610]: E1006 10:02:01.526579 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42\": container with ID starting with 3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42 not found: ID does not exist" containerID="3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.526608 4610 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42"} err="failed to get container status \"3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42\": rpc error: code = NotFound desc = could not find container \"3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42\": container with ID starting with 3364c2372455a12558189181cce1a42793e121acaaf6f7628e27241cadcc4c42 not found: ID does not exist" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.526627 4610 scope.go:117] "RemoveContainer" containerID="527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292" Oct 06 10:02:01 crc kubenswrapper[4610]: E1006 10:02:01.526995 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292\": container with ID starting with 527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292 not found: ID does not exist" containerID="527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.527016 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292"} err="failed to get container status \"527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292\": rpc error: code = NotFound desc = could not find container \"527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292\": container with ID starting with 527e38a6dca63a0d83cbf7d3385b1af35dbc32dbab63231c800f13fd84b9e292 not found: ID does not exist" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.527030 4610 scope.go:117] "RemoveContainer" containerID="cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30" Oct 06 10:02:01 crc kubenswrapper[4610]: E1006 10:02:01.527310 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30\": container with ID starting with cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30 not found: ID does not exist" containerID="cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30" Oct 06 10:02:01 crc kubenswrapper[4610]: I1006 10:02:01.527343 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30"} err="failed to get container status \"cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30\": rpc error: code = NotFound desc = could not find container \"cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30\": container with ID starting with cce32beb650f45d76f8e5a04b71343c4f02bc0e093b5094a4431bfc406bb5d30 not found: ID does not exist" Oct 06 10:02:03 crc kubenswrapper[4610]: I1006 10:02:03.082535 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5216c1-a820-47c5-b334-1636537737fe" path="/var/lib/kubelet/pods/6c5216c1-a820-47c5-b334-1636537737fe/volumes" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.188150 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6849467754-2xpgn_8416ea1e-d79e-4dc2-8902-d59c8c4bbc60/barbican-api-log/0.log" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.197847 4610 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6849467754-2xpgn_8416ea1e-d79e-4dc2-8902-d59c8c4bbc60/barbican-api/0.log" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.468104 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79f9c7c798-mtvls_9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d/barbican-keystone-listener/0.log" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.512913 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79f9c7c798-mtvls_9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d/barbican-keystone-listener-log/0.log" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.711885 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776cb49575-gr6gq_8e13d8e7-f118-4d18-ab69-162dadc7f649/barbican-worker-log/0.log" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.738741 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776cb49575-gr6gq_8e13d8e7-f118-4d18-ab69-162dadc7f649/barbican-worker/0.log" Oct 06 10:03:02 crc kubenswrapper[4610]: I1006 10:03:02.968296 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv_290e102b-3121-4f44-b861-2b2e2e297f7b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:03 crc kubenswrapper[4610]: I1006 10:03:03.146967 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/ceilometer-central-agent/0.log" Oct 06 10:03:03 crc kubenswrapper[4610]: I1006 10:03:03.583279 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/sg-core/0.log" Oct 06 10:03:03 crc kubenswrapper[4610]: I1006 10:03:03.685643 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/ceilometer-notification-agent/0.log" Oct 06 10:03:03 crc kubenswrapper[4610]: I1006 10:03:03.686137 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/proxy-httpd/0.log" Oct 06 10:03:03 crc kubenswrapper[4610]: I1006 10:03:03.886122 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_279bb64b-8fba-4afc-9ded-6bd2375521ba/cinder-api-log/0.log" Oct 06 10:03:03 crc kubenswrapper[4610]: I1006 10:03:03.956242 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_279bb64b-8fba-4afc-9ded-6bd2375521ba/cinder-api/0.log" Oct 06 10:03:04 crc kubenswrapper[4610]: I1006 10:03:04.120878 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e29afc72-dbf0-453c-b96b-42d0399d6286/cinder-scheduler/0.log" Oct 06 10:03:04 crc kubenswrapper[4610]: I1006 10:03:04.228281 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e29afc72-dbf0-453c-b96b-42d0399d6286/probe/0.log" Oct 06 10:03:04 crc kubenswrapper[4610]: I1006 10:03:04.446315 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6_489d3203-794d-455a-b5b8-97f933b8db19/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:04 crc kubenswrapper[4610]: I1006 10:03:04.682393 4610 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-687wd_b87a175d-5d06-4825-981c-ed2cf97fb652/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:04 crc kubenswrapper[4610]: I1006 10:03:04.839966 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6_e529d0f1-3da5-4178-8720-5769624f4490/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:04 crc kubenswrapper[4610]: I1006 10:03:04.925237 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667c9c995c-dfhd5_2228f60d-4cb6-43a2-9259-848d7353ad4b/init/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.109737 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667c9c995c-dfhd5_2228f60d-4cb6-43a2-9259-848d7353ad4b/init/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.256765 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667c9c995c-dfhd5_2228f60d-4cb6-43a2-9259-848d7353ad4b/dnsmasq-dns/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.486119 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz_460008b6-6b5b-43a1-b806-01340e52e472/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.541298 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_225e171a-3dd8-4d73-af22-fa01ef4a7359/glance-httpd/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.611516 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_225e171a-3dd8-4d73-af22-fa01ef4a7359/glance-log/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.845480 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ba3a861e-9618-4947-9e23-c285ec4d43a6/glance-httpd/0.log" Oct 06 10:03:05 crc kubenswrapper[4610]: I1006 10:03:05.939587 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ba3a861e-9618-4947-9e23-c285ec4d43a6/glance-log/0.log" Oct 06 10:03:06 crc kubenswrapper[4610]: I1006 10:03:06.199522 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-868f4bc56b-f2np4_0843392c-2df1-4619-9745-21ca7d06a589/horizon/0.log" Oct 06 10:03:06 crc kubenswrapper[4610]: I1006 10:03:06.367170 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ps99p_7e49b85b-bbed-4c13-b513-3d61369aa3c0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:06 crc kubenswrapper[4610]: I1006 10:03:06.541673 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-868f4bc56b-f2np4_0843392c-2df1-4619-9745-21ca7d06a589/horizon-log/0.log" Oct 06 10:03:06 crc kubenswrapper[4610]: I1006 10:03:06.611239 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rcwcf_b1e06674-8934-4170-9b16-5bf7292977ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:06 crc kubenswrapper[4610]: I1006 10:03:06.891192 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329081-rlm67_06b1f441-39d4-4c07-8696-045031364dd2/keystone-cron/0.log" Oct 06 10:03:07 crc 
kubenswrapper[4610]: I1006 10:03:07.194359 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-84759bdbdc-r6gkv_27ee29ca-3774-42c0-a3d0-164644f89e7d/keystone-api/0.log" Oct 06 10:03:07 crc kubenswrapper[4610]: I1006 10:03:07.266917 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cdd44fea-d46e-45e1-be47-89cc8a1f63c7/kube-state-metrics/0.log" Oct 06 10:03:07 crc kubenswrapper[4610]: I1006 10:03:07.399173 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9gsth_7198dacf-4e83-415a-a302-d543a7c2fea9/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:08 crc kubenswrapper[4610]: I1006 10:03:08.030851 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69d948d6bf-n5vv6_f61e2bff-9119-4208-a7a0-c8da777e049b/neutron-httpd/0.log" Oct 06 10:03:08 crc kubenswrapper[4610]: I1006 10:03:08.143505 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69d948d6bf-n5vv6_f61e2bff-9119-4208-a7a0-c8da777e049b/neutron-api/0.log" Oct 06 10:03:08 crc kubenswrapper[4610]: I1006 10:03:08.151444 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn_4288043c-e9b4-4c1c-8234-3f44be6fbc2f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:09 crc kubenswrapper[4610]: I1006 10:03:09.113544 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bb2657e1-8319-4a7d-be1f-a48d66bd5ba8/nova-cell0-conductor-conductor/0.log" Oct 06 10:03:09 crc kubenswrapper[4610]: I1006 10:03:09.540192 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_49c7e6ea-9091-4658-bc36-3c82f6f25682/nova-api-log/0.log" Oct 06 10:03:09 crc kubenswrapper[4610]: I1006 10:03:09.739108 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992/nova-cell1-conductor-conductor/0.log" Oct 06 10:03:09 crc kubenswrapper[4610]: I1006 10:03:09.877575 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_49c7e6ea-9091-4658-bc36-3c82f6f25682/nova-api-api/0.log" Oct 06 10:03:10 crc kubenswrapper[4610]: I1006 10:03:10.074255 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_760824cd-931b-4588-85d0-8b0548fc8c38/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 10:03:10 crc kubenswrapper[4610]: I1006 10:03:10.504014 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jxfdh_3ba105b9-3b48-4236-a86d-6fcded83393a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:10 crc kubenswrapper[4610]: I1006 10:03:10.719076 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_65ccdb5a-a886-4df8-9f4c-9bccb814641a/nova-metadata-log/0.log" Oct 06 10:03:11 crc kubenswrapper[4610]: I1006 10:03:11.441240 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_272d54f3-da6d-4d44-b723-956ac2cc65a4/nova-scheduler-scheduler/0.log" Oct 06 10:03:11 crc kubenswrapper[4610]: I1006 10:03:11.935365 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6faa6b22-87fb-46cf-93cf-0848f9f7ce06/mysql-bootstrap/0.log" Oct 06 10:03:12 crc kubenswrapper[4610]: I1006 10:03:12.141310 4610 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6faa6b22-87fb-46cf-93cf-0848f9f7ce06/mysql-bootstrap/0.log" Oct 06 10:03:12 crc kubenswrapper[4610]: I1006 10:03:12.182505 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6faa6b22-87fb-46cf-93cf-0848f9f7ce06/galera/0.log" Oct 06 10:03:12 crc kubenswrapper[4610]: I1006 10:03:12.272155 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_65ccdb5a-a886-4df8-9f4c-9bccb814641a/nova-metadata-metadata/0.log" Oct 06 10:03:12 crc kubenswrapper[4610]: I1006 10:03:12.464641 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_21951fd5-4bf8-4851-b82f-874f75967f7c/mysql-bootstrap/0.log" Oct 06 10:03:12 crc kubenswrapper[4610]: I1006 10:03:12.721398 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_21951fd5-4bf8-4851-b82f-874f75967f7c/galera/0.log" Oct 06 10:03:12 crc kubenswrapper[4610]: I1006 10:03:12.758750 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_21951fd5-4bf8-4851-b82f-874f75967f7c/mysql-bootstrap/0.log" Oct 06 10:03:13 crc kubenswrapper[4610]: I1006 10:03:13.003819 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5b7473c8-fdfd-426a-99da-57bc4175e303/openstackclient/0.log" Oct 06 10:03:13 crc kubenswrapper[4610]: I1006 10:03:13.035465 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6hjff_1e77ce11-f629-48ab-820e-e67fbfc3ba57/ovn-controller/0.log" Oct 06 10:03:13 crc kubenswrapper[4610]: I1006 10:03:13.202008 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-62bzc_cfae3507-92ed-4d33-85d2-b5a0c3beed93/openstack-network-exporter/0.log" Oct 06 10:03:13 crc kubenswrapper[4610]: I1006 10:03:13.788182 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovsdb-server-init/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.004030 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovsdb-server-init/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.035700 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovsdb-server/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.040648 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovs-vswitchd/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.419941 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gbnnk_e9eecc46-8a50-486b-ae37-2ba0f62b5216/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.555723 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f726409a-ab18-426c-84e7-2d8ae473a3d4/openstack-network-exporter/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.608765 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f726409a-ab18-426c-84e7-2d8ae473a3d4/ovn-northd/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.807115 4610 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea778a76-1f2e-4289-8b2f-7ccc1975eb3d/openstack-network-exporter/0.log" Oct 06 10:03:14 crc kubenswrapper[4610]: I1006 10:03:14.873924 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea778a76-1f2e-4289-8b2f-7ccc1975eb3d/ovsdbserver-nb/0.log" Oct 06 10:03:15 crc kubenswrapper[4610]: I1006 10:03:15.006760 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f51717ef-7ac5-45b1-ae7c-beddba660645/openstack-network-exporter/0.log" Oct 06 10:03:15 crc kubenswrapper[4610]: I1006 10:03:15.213058 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f51717ef-7ac5-45b1-ae7c-beddba660645/ovsdbserver-sb/0.log" Oct 06 10:03:15 crc kubenswrapper[4610]: I1006 10:03:15.669676 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7659d8b-729ds_a417a1f5-8eba-4d85-9b6e-730463fe2734/placement-api/0.log" Oct 06 10:03:15 crc kubenswrapper[4610]: I1006 10:03:15.824569 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7659d8b-729ds_a417a1f5-8eba-4d85-9b6e-730463fe2734/placement-log/0.log" Oct 06 10:03:15 crc kubenswrapper[4610]: I1006 10:03:15.871420 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_099c0f32-ad2c-4b69-a308-f46f3dbab2be/setup-container/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.087657 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_099c0f32-ad2c-4b69-a308-f46f3dbab2be/setup-container/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.186687 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_099c0f32-ad2c-4b69-a308-f46f3dbab2be/rabbitmq/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.377997 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_060bb971-d347-44c3-b9ce-6c06c13bcb51/setup-container/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.640463 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_060bb971-d347-44c3-b9ce-6c06c13bcb51/setup-container/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.704614 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_060bb971-d347-44c3-b9ce-6c06c13bcb51/rabbitmq/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.945507 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2dstt_4dd4792e-84d5-41ff-bc84-b3d0bde5377a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:16 crc kubenswrapper[4610]: I1006 10:03:16.960653 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw_f1784103-7612-4a23-9135-eb81df0fe2ce/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:17 crc kubenswrapper[4610]: I1006 10:03:17.332068 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9_f70ce47b-f642-41e9-8649-7dc466c07c27/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:17 crc kubenswrapper[4610]: I1006 10:03:17.471438 4610 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fgvht_5c30c0cb-9027-4935-bca1-0debc398c091/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:17 crc kubenswrapper[4610]: I1006 10:03:17.684571 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4l44_ee7bec6c-22cc-448e-8939-798d80db2045/ssh-known-hosts-edpm-deployment/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.009200 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c65b98c55-xjdpw_2d4cceaf-e744-49da-a634-84401f61d862/proxy-server/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.104436 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c65b98c55-xjdpw_2d4cceaf-e744-49da-a634-84401f61d862/proxy-httpd/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.478536 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pv9bk_83179f37-2a3e-4b31-8d5e-fcdaf56961a5/swift-ring-rebalance/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.711187 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-auditor/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.731241 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-reaper/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.859549 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-replicator/0.log" Oct 06 10:03:18 crc kubenswrapper[4610]: I1006 10:03:18.972931 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-server/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.024505 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-auditor/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.214210 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-replicator/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.271665 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-updater/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.295719 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-server/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.473302 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-expirer/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.552487 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-auditor/0.log" Oct 06 10:03:19 crc kubenswrapper[4610]: I1006 10:03:19.567928 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-replicator/0.log" Oct 06 10:03:20 crc kubenswrapper[4610]: I1006 10:03:20.160960 4610 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/rsync/0.log" Oct 06 10:03:20 crc kubenswrapper[4610]: I1006 10:03:20.197600 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-updater/0.log" Oct 06 10:03:20 crc kubenswrapper[4610]: I1006 10:03:20.236265 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-server/0.log" Oct 06 10:03:20 crc kubenswrapper[4610]: I1006 10:03:20.541175 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/swift-recon-cron/0.log" Oct 06 10:03:20 crc kubenswrapper[4610]: I1006 10:03:20.579103 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd_a11ef1e8-ba4f-4b82-adad-cbe054665d4c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:20 crc kubenswrapper[4610]: I1006 10:03:20.831715 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6effef24-402a-46e6-a15a-02815ef810ae/tempest-tests-tempest-tests-runner/0.log" Oct 06 10:03:21 crc kubenswrapper[4610]: I1006 10:03:21.132785 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5_d37ed6ae-3ad3-4604-9149-4e2b8006375e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:03:21 crc kubenswrapper[4610]: I1006 10:03:21.154424 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9a73cfa7-ef0b-4dda-9ca4-de80de751a61/test-operator-logs-container/0.log" Oct 06 10:03:29 crc kubenswrapper[4610]: I1006 10:03:29.147543 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ef8e8806-0063-480d-933b-5a6c760d503e/memcached/0.log" Oct 06 10:03:46 crc kubenswrapper[4610]: I1006 10:03:46.469532 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 10:03:46 crc kubenswrapper[4610]: I1006 10:03:46.470105 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 10:03:52 crc kubenswrapper[4610]: I1006 10:03:52.355503 4610 generic.go:334] "Generic (PLEG): container finished" podID="1dc7eda8-2bf1-4115-a854-864f4510aee6" containerID="e52e77a19dd9badac8c549d66f315b2f35d79c6725d0cddb882b748283693a18" exitCode=0 Oct 06 10:03:52 crc kubenswrapper[4610]: I1006 10:03:52.355556 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" event={"ID":"1dc7eda8-2bf1-4115-a854-864f4510aee6","Type":"ContainerDied","Data":"e52e77a19dd9badac8c549d66f315b2f35d79c6725d0cddb882b748283693a18"} Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.489317 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.528929 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-7qdqp"] Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.537239 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-7qdqp"] Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.547309 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf7cc\" (UniqueName: \"kubernetes.io/projected/1dc7eda8-2bf1-4115-a854-864f4510aee6-kube-api-access-pf7cc\") pod \"1dc7eda8-2bf1-4115-a854-864f4510aee6\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.547625 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dc7eda8-2bf1-4115-a854-864f4510aee6-host\") pod \"1dc7eda8-2bf1-4115-a854-864f4510aee6\" (UID: \"1dc7eda8-2bf1-4115-a854-864f4510aee6\") " Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.547677 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc7eda8-2bf1-4115-a854-864f4510aee6-host" (OuterVolumeSpecName: "host") pod "1dc7eda8-2bf1-4115-a854-864f4510aee6" (UID: "1dc7eda8-2bf1-4115-a854-864f4510aee6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 10:03:53 crc kubenswrapper[4610]: I1006 10:03:53.548113 4610 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1dc7eda8-2bf1-4115-a854-864f4510aee6-host\") on node \"crc\" DevicePath \"\"" Oct 06 10:03:54 crc kubenswrapper[4610]: I1006 10:03:54.091290 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc7eda8-2bf1-4115-a854-864f4510aee6-kube-api-access-pf7cc" (OuterVolumeSpecName: "kube-api-access-pf7cc") pod "1dc7eda8-2bf1-4115-a854-864f4510aee6" (UID: "1dc7eda8-2bf1-4115-a854-864f4510aee6"). InnerVolumeSpecName "kube-api-access-pf7cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:03:54 crc kubenswrapper[4610]: I1006 10:03:54.159967 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf7cc\" (UniqueName: \"kubernetes.io/projected/1dc7eda8-2bf1-4115-a854-864f4510aee6-kube-api-access-pf7cc\") on node \"crc\" DevicePath \"\"" Oct 06 10:03:54 crc kubenswrapper[4610]: I1006 10:03:54.383669 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9022555ab6937e520a5d86b29da19b0a9648632f61c49b1f5987201495cacfaa" Oct 06 10:03:54 crc kubenswrapper[4610]: I1006 10:03:54.383674 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-7qdqp" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.087518 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc7eda8-2bf1-4115-a854-864f4510aee6" path="/var/lib/kubelet/pods/1dc7eda8-2bf1-4115-a854-864f4510aee6/volumes" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321095 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-l5hvl"] Oct 06 10:03:55 crc kubenswrapper[4610]: E1006 10:03:55.321500 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="extract-utilities" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321523 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="extract-utilities" Oct 06 10:03:55 crc kubenswrapper[4610]: E1006 10:03:55.321576 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="registry-server" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321584 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="registry-server" Oct 06 10:03:55 crc kubenswrapper[4610]: E1006 10:03:55.321602 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="extract-content" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321611 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="extract-content" Oct 06 10:03:55 crc kubenswrapper[4610]: E1006 10:03:55.321623 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc7eda8-2bf1-4115-a854-864f4510aee6" containerName="container-00" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321632 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc7eda8-2bf1-4115-a854-864f4510aee6" containerName="container-00" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321894 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5216c1-a820-47c5-b334-1636537737fe" containerName="registry-server" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.321928 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc7eda8-2bf1-4115-a854-864f4510aee6" containerName="container-00" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.322743 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.415833 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xh77\" (UniqueName: \"kubernetes.io/projected/f190b18e-ee02-40a1-ba70-9b8fd0aab834-kube-api-access-9xh77\") pod \"crc-debug-l5hvl\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.415939 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b18e-ee02-40a1-ba70-9b8fd0aab834-host\") pod \"crc-debug-l5hvl\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.517773 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xh77\" (UniqueName: \"kubernetes.io/projected/f190b18e-ee02-40a1-ba70-9b8fd0aab834-kube-api-access-9xh77\") pod \"crc-debug-l5hvl\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.517899 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b18e-ee02-40a1-ba70-9b8fd0aab834-host\") pod \"crc-debug-l5hvl\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:55 crc kubenswrapper[4610]: I1006 10:03:55.518216 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b18e-ee02-40a1-ba70-9b8fd0aab834-host\") pod \"crc-debug-l5hvl\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:56 crc kubenswrapper[4610]: I1006 10:03:56.090424 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xh77\" (UniqueName: \"kubernetes.io/projected/f190b18e-ee02-40a1-ba70-9b8fd0aab834-kube-api-access-9xh77\") pod \"crc-debug-l5hvl\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:56 crc kubenswrapper[4610]: I1006 10:03:56.246456 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:57 crc kubenswrapper[4610]: I1006 10:03:57.419944 4610 generic.go:334] "Generic (PLEG): container finished" podID="f190b18e-ee02-40a1-ba70-9b8fd0aab834" containerID="0c383ae48c7413245fd1375dcc0e3080f2b36d823e000a0040c0bbdf567f1ba6" exitCode=0 Oct 06 10:03:57 crc kubenswrapper[4610]: I1006 10:03:57.420100 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" event={"ID":"f190b18e-ee02-40a1-ba70-9b8fd0aab834","Type":"ContainerDied","Data":"0c383ae48c7413245fd1375dcc0e3080f2b36d823e000a0040c0bbdf567f1ba6"} Oct 06 10:03:57 crc kubenswrapper[4610]: I1006 10:03:57.420446 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" event={"ID":"f190b18e-ee02-40a1-ba70-9b8fd0aab834","Type":"ContainerStarted","Data":"3829c4083fa85614cd26a9e19639c704e0a8f9f2268e82c1a456d33c25f512d6"} Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.516464 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.683837 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f190b18e-ee02-40a1-ba70-9b8fd0aab834-host" (OuterVolumeSpecName: "host") pod "f190b18e-ee02-40a1-ba70-9b8fd0aab834" (UID: "f190b18e-ee02-40a1-ba70-9b8fd0aab834"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.684103 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b18e-ee02-40a1-ba70-9b8fd0aab834-host\") pod \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.684209 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xh77\" (UniqueName: \"kubernetes.io/projected/f190b18e-ee02-40a1-ba70-9b8fd0aab834-kube-api-access-9xh77\") pod \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\" (UID: \"f190b18e-ee02-40a1-ba70-9b8fd0aab834\") " Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.686096 4610 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b18e-ee02-40a1-ba70-9b8fd0aab834-host\") on node \"crc\" DevicePath \"\"" Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.692168 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f190b18e-ee02-40a1-ba70-9b8fd0aab834-kube-api-access-9xh77" (OuterVolumeSpecName: "kube-api-access-9xh77") pod "f190b18e-ee02-40a1-ba70-9b8fd0aab834" (UID: "f190b18e-ee02-40a1-ba70-9b8fd0aab834"). InnerVolumeSpecName "kube-api-access-9xh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:03:58 crc kubenswrapper[4610]: I1006 10:03:58.789795 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xh77\" (UniqueName: \"kubernetes.io/projected/f190b18e-ee02-40a1-ba70-9b8fd0aab834-kube-api-access-9xh77\") on node \"crc\" DevicePath \"\"" Oct 06 10:03:59 crc kubenswrapper[4610]: I1006 10:03:59.433732 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" event={"ID":"f190b18e-ee02-40a1-ba70-9b8fd0aab834","Type":"ContainerDied","Data":"3829c4083fa85614cd26a9e19639c704e0a8f9f2268e82c1a456d33c25f512d6"} Oct 06 10:03:59 crc kubenswrapper[4610]: I1006 10:03:59.433771 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3829c4083fa85614cd26a9e19639c704e0a8f9f2268e82c1a456d33c25f512d6" Oct 06 10:03:59 crc kubenswrapper[4610]: I1006 10:03:59.433806 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-l5hvl" Oct 06 10:04:04 crc kubenswrapper[4610]: I1006 10:04:04.784412 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-l5hvl"] Oct 06 10:04:04 crc kubenswrapper[4610]: I1006 10:04:04.792387 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-l5hvl"] Oct 06 10:04:05 crc kubenswrapper[4610]: I1006 10:04:05.088754 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f190b18e-ee02-40a1-ba70-9b8fd0aab834" path="/var/lib/kubelet/pods/f190b18e-ee02-40a1-ba70-9b8fd0aab834/volumes" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.039598 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-rcs6h"] Oct 06 10:04:06 crc kubenswrapper[4610]: E1006 10:04:06.040197 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f190b18e-ee02-40a1-ba70-9b8fd0aab834" containerName="container-00" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.040218 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f190b18e-ee02-40a1-ba70-9b8fd0aab834" containerName="container-00" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.040591 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f190b18e-ee02-40a1-ba70-9b8fd0aab834" containerName="container-00" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.042676 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.209225 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wnc6\" (UniqueName: \"kubernetes.io/projected/f4be2800-1a4b-4cd5-b9b6-886d2a123683-kube-api-access-6wnc6\") pod \"crc-debug-rcs6h\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.209340 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4be2800-1a4b-4cd5-b9b6-886d2a123683-host\") pod \"crc-debug-rcs6h\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.311339 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wnc6\" (UniqueName: \"kubernetes.io/projected/f4be2800-1a4b-4cd5-b9b6-886d2a123683-kube-api-access-6wnc6\") pod \"crc-debug-rcs6h\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.311946 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4be2800-1a4b-4cd5-b9b6-886d2a123683-host\") pod \"crc-debug-rcs6h\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.312129 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4be2800-1a4b-4cd5-b9b6-886d2a123683-host\") pod \"crc-debug-rcs6h\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.338518 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wnc6\" (UniqueName: \"kubernetes.io/projected/f4be2800-1a4b-4cd5-b9b6-886d2a123683-kube-api-access-6wnc6\") pod \"crc-debug-rcs6h\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.366814 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:06 crc kubenswrapper[4610]: I1006 10:04:06.494684 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" event={"ID":"f4be2800-1a4b-4cd5-b9b6-886d2a123683","Type":"ContainerStarted","Data":"b3a61b8c6655238c3512e4033ebcfa499d322d7775b0e580751c73627044a413"} Oct 06 10:04:07 crc kubenswrapper[4610]: I1006 10:04:07.509908 4610 generic.go:334] "Generic (PLEG): container finished" podID="f4be2800-1a4b-4cd5-b9b6-886d2a123683" containerID="7dd835f296e9a73a1c9eff3b184bb50f1819f73a8753bb0ccc4597ac29dacf3a" exitCode=0 Oct 06 10:04:07 crc kubenswrapper[4610]: I1006 10:04:07.510123 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" event={"ID":"f4be2800-1a4b-4cd5-b9b6-886d2a123683","Type":"ContainerDied","Data":"7dd835f296e9a73a1c9eff3b184bb50f1819f73a8753bb0ccc4597ac29dacf3a"} Oct 06 10:04:07 crc kubenswrapper[4610]: I1006 10:04:07.552100 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-rcs6h"] Oct 06 10:04:07 crc kubenswrapper[4610]: I1006 10:04:07.557916 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wcrz5/crc-debug-rcs6h"] Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.622801 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.758818 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4be2800-1a4b-4cd5-b9b6-886d2a123683-host\") pod \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.759306 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wnc6\" (UniqueName: \"kubernetes.io/projected/f4be2800-1a4b-4cd5-b9b6-886d2a123683-kube-api-access-6wnc6\") pod \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\" (UID: \"f4be2800-1a4b-4cd5-b9b6-886d2a123683\") " Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.758938 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4be2800-1a4b-4cd5-b9b6-886d2a123683-host" (OuterVolumeSpecName: "host") pod "f4be2800-1a4b-4cd5-b9b6-886d2a123683" (UID: "f4be2800-1a4b-4cd5-b9b6-886d2a123683"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.759859 4610 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4be2800-1a4b-4cd5-b9b6-886d2a123683-host\") on node \"crc\" DevicePath \"\"" Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.764686 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4be2800-1a4b-4cd5-b9b6-886d2a123683-kube-api-access-6wnc6" (OuterVolumeSpecName: "kube-api-access-6wnc6") pod "f4be2800-1a4b-4cd5-b9b6-886d2a123683" (UID: "f4be2800-1a4b-4cd5-b9b6-886d2a123683"). InnerVolumeSpecName "kube-api-access-6wnc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:04:08 crc kubenswrapper[4610]: I1006 10:04:08.861383 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wnc6\" (UniqueName: \"kubernetes.io/projected/f4be2800-1a4b-4cd5-b9b6-886d2a123683-kube-api-access-6wnc6\") on node \"crc\" DevicePath \"\"" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.085660 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4be2800-1a4b-4cd5-b9b6-886d2a123683" path="/var/lib/kubelet/pods/f4be2800-1a4b-4cd5-b9b6-886d2a123683/volumes" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.218896 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-gcbb8_e8d37aed-cf46-47a0-a8ea-cfec57404966/kube-rbac-proxy/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.276597 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-gcbb8_e8d37aed-cf46-47a0-a8ea-cfec57404966/manager/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.414276 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/util/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.525455 4610 scope.go:117] "RemoveContainer" containerID="7dd835f296e9a73a1c9eff3b184bb50f1819f73a8753bb0ccc4597ac29dacf3a" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.525775 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/crc-debug-rcs6h" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.577284 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/pull/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.613588 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/util/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.614792 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/pull/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.782799 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/util/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.838569 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/extract/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.862814 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/pull/0.log" Oct 06 10:04:09 crc kubenswrapper[4610]: I1006 10:04:09.983739 4610 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-65ffb_590d1736-08ea-4b24-9462-51e4f9eb2169/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.004879 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-65ffb_590d1736-08ea-4b24-9462-51e4f9eb2169/manager/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.114625 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-f6dh9_23749a1a-8450-4412-850b-1e044d290c69/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.218263 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-f6dh9_23749a1a-8450-4412-850b-1e044d290c69/manager/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.332977 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-z448l_55d8474b-1188-4617-abe4-d5e45d9a85cb/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.428544 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-z448l_55d8474b-1188-4617-abe4-d5e45d9a85cb/manager/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.488556 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-xfwfl_08b5e994-103b-40ba-aef6-4dd36e5c456e/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.575503 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-xfwfl_08b5e994-103b-40ba-aef6-4dd36e5c456e/manager/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.686830 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-cqqbc_b63b18e4-4aee-4a86-a5cb-23393a3cfaa3/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.693453 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-cqqbc_b63b18e4-4aee-4a86-a5cb-23393a3cfaa3/manager/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.729318 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-4hldf_aa003bf3-ca26-468d-975a-5ceaa0361f14/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.889195 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-575v4_8c2d89eb-7d33-4268-901a-69b008f224d4/kube-rbac-proxy/0.log" Oct 06 10:04:10 crc kubenswrapper[4610]: I1006 10:04:10.983214 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-4hldf_aa003bf3-ca26-468d-975a-5ceaa0361f14/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.016851 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-575v4_8c2d89eb-7d33-4268-901a-69b008f224d4/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.114249 4610 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-7sghv_2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629/kube-rbac-proxy/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.232295 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-7sghv_2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.281843 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-6dqrl_8ef0dcc5-529c-4a68-ba57-c68198a73de0/kube-rbac-proxy/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.330872 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-6dqrl_8ef0dcc5-529c-4a68-ba57-c68198a73de0/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.447259 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-852df_0dfb923d-89c5-4fd0-af84-b73494c4cfc2/kube-rbac-proxy/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.487536 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-852df_0dfb923d-89c5-4fd0-af84-b73494c4cfc2/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.671972 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-n7mj5_54de0bca-8a80-49a0-ae9f-0fe13fdeda11/kube-rbac-proxy/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.749086 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-n7mj5_54de0bca-8a80-49a0-ae9f-0fe13fdeda11/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.805780 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-6h5w4_95dcc684-207d-4745-949b-d2bd559b9f18/kube-rbac-proxy/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.929204 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-6h5w4_95dcc684-207d-4745-949b-d2bd559b9f18/manager/0.log" Oct 06 10:04:11 crc kubenswrapper[4610]: I1006 10:04:11.952383 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-5nhb8_e30086d8-8211-4ef0-ae80-ec1d79719f51/kube-rbac-proxy/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.064271 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-5nhb8_e30086d8-8211-4ef0-ae80-ec1d79719f51/manager/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.161637 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l_cc6ec685-6841-44c5-8315-462e605aa2d0/manager/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.165775 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l_cc6ec685-6841-44c5-8315-462e605aa2d0/kube-rbac-proxy/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 
10:04:12.314931 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-669d7f654d-zkg2w_533cbdde-bc4c-43b3-a9dd-e72d9b1aba90/kube-rbac-proxy/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.488924 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6497dff45c-kjs56_f88899ff-f714-4c64-8a83-bf97a4c80c1b/kube-rbac-proxy/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.800396 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6497dff45c-kjs56_f88899ff-f714-4c64-8a83-bf97a4c80c1b/operator/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.803194 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qh9tx_02ca177f-d4f8-419b-babe-caeb9a7272fe/registry-server/0.log" Oct 06 10:04:12 crc kubenswrapper[4610]: I1006 10:04:12.954916 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-j9k2v_10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95/kube-rbac-proxy/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.144762 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-x4rkd_ce2175a4-fac2-4259-91c9-6857fabd2755/kube-rbac-proxy/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.167427 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-j9k2v_10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95/manager/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.209314 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-x4rkd_ce2175a4-fac2-4259-91c9-6857fabd2755/manager/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.335250 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-669d7f654d-zkg2w_533cbdde-bc4c-43b3-a9dd-e72d9b1aba90/manager/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.368500 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d_ae310e32-abf5-4646-a09d-bbf21cd33dc6/operator/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.479523 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-twqvt_becf25ed-9d23-4cfa-afe3-7301d5476a7d/kube-rbac-proxy/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.552646 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-twqvt_becf25ed-9d23-4cfa-afe3-7301d5476a7d/manager/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.584039 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-df2ht_f40be14e-8191-4b07-8f45-01a5d18ac504/kube-rbac-proxy/0.log" Oct 06 10:04:13 crc kubenswrapper[4610]: I1006 10:04:13.670810 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-df2ht_f40be14e-8191-4b07-8f45-01a5d18ac504/manager/0.log" Oct 06 10:04:14 crc 
kubenswrapper[4610]: I1006 10:04:14.093585 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-sp9cd_15cb4fda-d42c-4ce7-a195-8476f589676e/kube-rbac-proxy/0.log" Oct 06 10:04:14 crc kubenswrapper[4610]: I1006 10:04:14.117218 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-sp9cd_15cb4fda-d42c-4ce7-a195-8476f589676e/manager/0.log" Oct 06 10:04:14 crc kubenswrapper[4610]: I1006 10:04:14.118615 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-9lbhh_2296f857-2cd2-45d3-907c-94e9eb4262ab/kube-rbac-proxy/0.log" Oct 06 10:04:14 crc kubenswrapper[4610]: I1006 10:04:14.299092 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-9lbhh_2296f857-2cd2-45d3-907c-94e9eb4262ab/manager/0.log" Oct 06 10:04:16 crc kubenswrapper[4610]: I1006 10:04:16.469127 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 10:04:16 crc kubenswrapper[4610]: I1006 10:04:16.469623 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 10:04:32 crc kubenswrapper[4610]: I1006 10:04:32.461806 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hljsk_351aa4d4-e29f-4405-9985-5953396ca08e/control-plane-machine-set-operator/0.log" Oct 06 10:04:32 crc kubenswrapper[4610]: I1006 10:04:32.662487 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9g4kq_db79ee81-c008-4374-9523-e762c47c9668/kube-rbac-proxy/0.log" Oct 06 10:04:32 crc kubenswrapper[4610]: I1006 10:04:32.687100 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9g4kq_db79ee81-c008-4374-9523-e762c47c9668/machine-api-operator/0.log" Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.469582 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.470056 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.470100 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.470828 4610 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.470876 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" gracePeriod=600 Oct 06 10:04:46 crc kubenswrapper[4610]: E1006 10:04:46.591585 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.893402 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" exitCode=0 Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.893441 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af"} Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.893491 4610 scope.go:117] "RemoveContainer" containerID="649a9565921e23e23c12e053e6001d7964396f792a9fdf0e7cd60160275ccef6" Oct 06 10:04:46 crc kubenswrapper[4610]: I1006 10:04:46.894110 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:04:46 crc kubenswrapper[4610]: E1006 10:04:46.894352 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:04:47 crc kubenswrapper[4610]: I1006 10:04:47.409935 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-pxwn5_66256ae8-d5ea-4800-85a5-5b61f7475b8e/cert-manager-controller/0.log" Oct 06 10:04:47 crc kubenswrapper[4610]: I1006 10:04:47.554100 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-9rmxk_40b562f4-5aac-4a81-b2b9-7a449b662cfc/cert-manager-cainjector/0.log" Oct 06 10:04:47 crc kubenswrapper[4610]: I1006 10:04:47.584890 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zslmc_e7133e75-e1cc-410d-828b-18221c64707c/cert-manager-webhook/0.log" Oct 06 10:04:59 crc kubenswrapper[4610]: I1006 10:04:59.078682 4610 scope.go:117] "RemoveContainer" 
containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:04:59 crc kubenswrapper[4610]: E1006 10:04:59.079350 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:05:01 crc kubenswrapper[4610]: I1006 10:05:01.891780 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-tjvnk_9590cbb8-dcf7-4c56-a984-028943b510d5/nmstate-console-plugin/0.log" Oct 06 10:05:02 crc kubenswrapper[4610]: I1006 10:05:02.109621 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-47k9v_bd9ce7eb-b1fc-4636-93fc-d007702a746f/nmstate-handler/0.log" Oct 06 10:05:02 crc kubenswrapper[4610]: I1006 10:05:02.177000 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4x98k_77dcdec2-c766-467b-a369-11ca28c22ae7/kube-rbac-proxy/0.log" Oct 06 10:05:02 crc kubenswrapper[4610]: I1006 10:05:02.202431 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4x98k_77dcdec2-c766-467b-a369-11ca28c22ae7/nmstate-metrics/0.log" Oct 06 10:05:02 crc kubenswrapper[4610]: I1006 10:05:02.394827 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bzk4l_8ceb9be4-5b44-4da2-adb3-fcfca400d23a/nmstate-operator/0.log" Oct 06 10:05:02 crc kubenswrapper[4610]: I1006 10:05:02.474575 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-m8xhq_340ebace-99cf-4a2b-aaef-975b3480a795/nmstate-webhook/0.log" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.659620 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pgm7d"] Oct 06 10:05:11 crc kubenswrapper[4610]: E1006 10:05:11.660713 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4be2800-1a4b-4cd5-b9b6-886d2a123683" containerName="container-00" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.660729 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4be2800-1a4b-4cd5-b9b6-886d2a123683" containerName="container-00" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.662038 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4be2800-1a4b-4cd5-b9b6-886d2a123683" containerName="container-00" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.666104 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.689514 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pgm7d"] Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.777749 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgvg\" (UniqueName: \"kubernetes.io/projected/886bce59-e8be-4837-85b5-b894f67de9ba-kube-api-access-xsgvg\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.778036 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-utilities\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.778119 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-catalog-content\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.884320 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-utilities\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.884373 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-catalog-content\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.884465 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgvg\" (UniqueName: \"kubernetes.io/projected/886bce59-e8be-4837-85b5-b894f67de9ba-kube-api-access-xsgvg\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.884947 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-utilities\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.885002 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-catalog-content\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.902851 4610 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xsgvg\" (UniqueName: \"kubernetes.io/projected/886bce59-e8be-4837-85b5-b894f67de9ba-kube-api-access-xsgvg\") pod \"certified-operators-pgm7d\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:11 crc kubenswrapper[4610]: I1006 10:05:11.987373 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:12 crc kubenswrapper[4610]: I1006 10:05:12.070392 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:05:12 crc kubenswrapper[4610]: E1006 10:05:12.070623 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:05:13 crc kubenswrapper[4610]: I1006 10:05:13.269551 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pgm7d"] Oct 06 10:05:14 crc kubenswrapper[4610]: I1006 10:05:14.258371 4610 generic.go:334] "Generic (PLEG): container finished" podID="886bce59-e8be-4837-85b5-b894f67de9ba" containerID="32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961" exitCode=0 Oct 06 10:05:14 crc kubenswrapper[4610]: I1006 10:05:14.258579 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerDied","Data":"32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961"} Oct 06 10:05:14 crc kubenswrapper[4610]: I1006 10:05:14.258604 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerStarted","Data":"cab7d3fa63e586b79e0942bbcf8ea1f055a5061c3ff799dab3c6ee2b02116c3e"} Oct 06 10:05:15 crc kubenswrapper[4610]: I1006 10:05:15.270435 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerStarted","Data":"74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f"} Oct 06 10:05:17 crc kubenswrapper[4610]: I1006 10:05:17.286616 4610 generic.go:334] "Generic (PLEG): container finished" podID="886bce59-e8be-4837-85b5-b894f67de9ba" containerID="74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f" exitCode=0 Oct 06 10:05:17 crc kubenswrapper[4610]: I1006 10:05:17.286802 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerDied","Data":"74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f"} Oct 06 10:05:18 crc kubenswrapper[4610]: I1006 10:05:18.297236 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerStarted","Data":"c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3"} Oct 06 10:05:18 crc kubenswrapper[4610]: I1006 10:05:18.315998 4610 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pgm7d" podStartSLOduration=3.877902321 podStartE2EDuration="7.315985133s" podCreationTimestamp="2025-10-06 10:05:11 +0000 UTC" firstStartedPulling="2025-10-06 10:05:14.267822981 +0000 UTC m=+5045.982876359" lastFinishedPulling="2025-10-06 10:05:17.705905783 +0000 UTC m=+5049.420959171" observedRunningTime="2025-10-06 10:05:18.312019769 +0000 UTC m=+5050.027073167" watchObservedRunningTime="2025-10-06 10:05:18.315985133 +0000 UTC m=+5050.031038521" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.389508 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-855xl_ddd30bb0-f54e-4aa2-81c2-f27b83aaf443/kube-rbac-proxy/0.log" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.482507 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-855xl_ddd30bb0-f54e-4aa2-81c2-f27b83aaf443/controller/0.log" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.691030 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.870377 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.905627 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.905638 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log" Oct 06 10:05:19 crc kubenswrapper[4610]: I1006 10:05:19.946005 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.116061 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.142651 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.204591 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.269973 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.420405 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.451721 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/controller/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.458370 4610 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.469623 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.645479 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/kube-rbac-proxy/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.656361 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/frr-metrics/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.669764 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/kube-rbac-proxy-frr/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.917452 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/reloader/0.log" Oct 06 10:05:20 crc kubenswrapper[4610]: I1006 10:05:20.996165 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-6pnsd_e0af20a6-573f-4421-b9a4-5d5005a855b8/frr-k8s-webhook-server/0.log" Oct 06 10:05:21 crc kubenswrapper[4610]: I1006 10:05:21.277036 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78b8b54fdd-fwfzv_4864fd4e-baeb-4b35-ae9b-b41f43515efd/manager/0.log" Oct 06 10:05:21 crc kubenswrapper[4610]: I1006 10:05:21.560619 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b496585dd-ndrsg_f2f355c4-bea3-46ef-b5bf-d7393c884ac1/webhook-server/0.log" Oct 06 10:05:21 crc kubenswrapper[4610]: I1006 10:05:21.638487 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nssl6_0585866f-da3e-4ab2-83a7-e0819349eb4d/kube-rbac-proxy/0.log" Oct 06 10:05:21 crc kubenswrapper[4610]: I1006 10:05:21.796646 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/frr/0.log" Oct 06 10:05:21 crc kubenswrapper[4610]: I1006 10:05:21.988144 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:21 crc kubenswrapper[4610]: I1006 10:05:21.988405 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.035781 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nssl6_0585866f-da3e-4ab2-83a7-e0819349eb4d/speaker/0.log" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.044305 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.366137 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xv59x"] Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.368351 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.379993 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.382361 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xv59x"] Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.489536 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-utilities\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.489893 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-catalog-content\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.490025 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjtt\" (UniqueName: \"kubernetes.io/projected/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-kube-api-access-vsjtt\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.591472 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjtt\" (UniqueName: \"kubernetes.io/projected/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-kube-api-access-vsjtt\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.591541 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-utilities\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.591620 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-catalog-content\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.592047 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-catalog-content\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.592173 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-utilities\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " 
pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.610510 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjtt\" (UniqueName: \"kubernetes.io/projected/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-kube-api-access-vsjtt\") pod \"redhat-operators-xv59x\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:22 crc kubenswrapper[4610]: I1006 10:05:22.687117 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:23 crc kubenswrapper[4610]: I1006 10:05:23.226888 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xv59x"] Oct 06 10:05:23 crc kubenswrapper[4610]: I1006 10:05:23.333011 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerStarted","Data":"116120d77d49d5ba36809c4be72513ac41fc300a8ded013574c6bff2a3a4c355"} Oct 06 10:05:24 crc kubenswrapper[4610]: I1006 10:05:24.070311 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:05:24 crc kubenswrapper[4610]: E1006 10:05:24.071721 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:05:24 crc kubenswrapper[4610]: I1006 10:05:24.345642 4610 generic.go:334] "Generic (PLEG): container finished" podID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerID="0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28" exitCode=0 Oct 06 10:05:24 crc kubenswrapper[4610]: I1006 10:05:24.345699 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerDied","Data":"0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28"} Oct 06 10:05:24 crc kubenswrapper[4610]: I1006 10:05:24.710436 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pgm7d"] Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.354433 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pgm7d" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="registry-server" containerID="cri-o://c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3" gracePeriod=2 Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.835479 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.951977 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgvg\" (UniqueName: \"kubernetes.io/projected/886bce59-e8be-4837-85b5-b894f67de9ba-kube-api-access-xsgvg\") pod \"886bce59-e8be-4837-85b5-b894f67de9ba\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.952229 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-utilities\") pod \"886bce59-e8be-4837-85b5-b894f67de9ba\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.952316 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-catalog-content\") pod \"886bce59-e8be-4837-85b5-b894f67de9ba\" (UID: \"886bce59-e8be-4837-85b5-b894f67de9ba\") " Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.953307 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-utilities" (OuterVolumeSpecName: "utilities") pod "886bce59-e8be-4837-85b5-b894f67de9ba" (UID: "886bce59-e8be-4837-85b5-b894f67de9ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:05:25 crc kubenswrapper[4610]: I1006 10:05:25.966655 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886bce59-e8be-4837-85b5-b894f67de9ba-kube-api-access-xsgvg" (OuterVolumeSpecName: "kube-api-access-xsgvg") pod "886bce59-e8be-4837-85b5-b894f67de9ba" (UID: "886bce59-e8be-4837-85b5-b894f67de9ba"). InnerVolumeSpecName "kube-api-access-xsgvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.010901 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "886bce59-e8be-4837-85b5-b894f67de9ba" (UID: "886bce59-e8be-4837-85b5-b894f67de9ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.054933 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.054974 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886bce59-e8be-4837-85b5-b894f67de9ba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.054988 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgvg\" (UniqueName: \"kubernetes.io/projected/886bce59-e8be-4837-85b5-b894f67de9ba-kube-api-access-xsgvg\") on node \"crc\" DevicePath \"\"" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.366619 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerStarted","Data":"f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a"} Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.369564 4610 generic.go:334] "Generic (PLEG): container finished" podID="886bce59-e8be-4837-85b5-b894f67de9ba" containerID="c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3" exitCode=0 Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.369611 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerDied","Data":"c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3"} Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.369623 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pgm7d" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.369671 4610 scope.go:117] "RemoveContainer" containerID="c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.369657 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgm7d" event={"ID":"886bce59-e8be-4837-85b5-b894f67de9ba","Type":"ContainerDied","Data":"cab7d3fa63e586b79e0942bbcf8ea1f055a5061c3ff799dab3c6ee2b02116c3e"} Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.395253 4610 scope.go:117] "RemoveContainer" containerID="74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.420122 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pgm7d"] Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.428771 4610 scope.go:117] "RemoveContainer" containerID="32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.434714 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pgm7d"] Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.467667 4610 scope.go:117] "RemoveContainer" containerID="c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3" Oct 06 10:05:26 crc kubenswrapper[4610]: E1006 10:05:26.468236 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3\": container with ID starting with c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3 not found: ID does not exist" containerID="c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.468277 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3"} err="failed to get container status \"c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3\": rpc error: code = NotFound desc = could not find container \"c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3\": container with ID starting with c630e6445aa79d989bb10c6f81009abf34eba48dd45a01984426db3f8af02bc3 not found: ID does not exist" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.468303 4610 scope.go:117] "RemoveContainer" containerID="74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f" Oct 06 10:05:26 crc kubenswrapper[4610]: E1006 10:05:26.468625 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f\": container with ID starting with 74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f not found: ID does not exist" containerID="74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.468764 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f"} err="failed to get container status \"74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f\": rpc error: code = NotFound desc = could not find 
container \"74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f\": container with ID starting with 74c2dfb27b580ec3f355c66ac99e4296590922c9d7b9ce4100fb75b6df96609f not found: ID does not exist" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.469634 4610 scope.go:117] "RemoveContainer" containerID="32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961" Oct 06 10:05:26 crc kubenswrapper[4610]: E1006 10:05:26.469935 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961\": container with ID starting with 32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961 not found: ID does not exist" containerID="32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961" Oct 06 10:05:26 crc kubenswrapper[4610]: I1006 10:05:26.469991 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961"} err="failed to get container status \"32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961\": rpc error: code = NotFound desc = could not find container \"32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961\": container with ID starting with 32de6d9a5f01887db7bb37c1a3d3f0e5b198a9789ca1b3158b511bb63e94f961 not found: ID does not exist" Oct 06 10:05:27 crc kubenswrapper[4610]: I1006 10:05:27.081613 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" path="/var/lib/kubelet/pods/886bce59-e8be-4837-85b5-b894f67de9ba/volumes" Oct 06 10:05:28 crc kubenswrapper[4610]: I1006 10:05:28.391322 4610 generic.go:334] "Generic (PLEG): container finished" podID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerID="f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a" exitCode=0 Oct 06 10:05:28 crc kubenswrapper[4610]: I1006 10:05:28.393079 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerDied","Data":"f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a"} Oct 06 10:05:29 crc kubenswrapper[4610]: I1006 10:05:29.403006 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerStarted","Data":"57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94"} Oct 06 10:05:29 crc kubenswrapper[4610]: I1006 10:05:29.424009 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xv59x" podStartSLOduration=2.891222962 podStartE2EDuration="7.423991303s" podCreationTimestamp="2025-10-06 10:05:22 +0000 UTC" firstStartedPulling="2025-10-06 10:05:24.349110199 +0000 UTC m=+5056.064163597" lastFinishedPulling="2025-10-06 10:05:28.88187855 +0000 UTC m=+5060.596931938" observedRunningTime="2025-10-06 10:05:29.418886059 +0000 UTC m=+5061.133939447" watchObservedRunningTime="2025-10-06 10:05:29.423991303 +0000 UTC m=+5061.139044691" Oct 06 10:05:32 crc kubenswrapper[4610]: I1006 10:05:32.687504 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:32 crc kubenswrapper[4610]: I1006 10:05:32.687888 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:33 crc kubenswrapper[4610]: I1006 10:05:33.750201 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xv59x" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="registry-server" probeResult="failure" output=< Oct 06 10:05:33 crc kubenswrapper[4610]: timeout: failed to connect service ":50051" within 1s Oct 06 10:05:33 crc kubenswrapper[4610]: > Oct 06 10:05:36 crc kubenswrapper[4610]: I1006 10:05:36.849400 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/util/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.103329 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/util/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.139148 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/pull/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.167983 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/pull/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.358655 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/util/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.425798 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/pull/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.440666 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/extract/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.586506 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-utilities/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.809769 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-content/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.812115 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-utilities/0.log" Oct 06 10:05:37 crc kubenswrapper[4610]: I1006 10:05:37.872975 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-content/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.070658 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:05:38 crc kubenswrapper[4610]: E1006 10:05:38.070945 4610 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.119415 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-utilities/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.127345 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-content/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.419928 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-utilities/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.883807 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/registry-server/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.893168 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-content/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.924390 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-utilities/0.log" Oct 06 10:05:38 crc kubenswrapper[4610]: I1006 10:05:38.955964 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-content/0.log" Oct 06 10:05:39 crc kubenswrapper[4610]: I1006 10:05:39.171602 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-content/0.log" Oct 06 10:05:39 crc kubenswrapper[4610]: I1006 10:05:39.194806 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-utilities/0.log" Oct 06 10:05:39 crc kubenswrapper[4610]: I1006 10:05:39.486752 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/util/0.log" Oct 06 10:05:39 crc kubenswrapper[4610]: I1006 10:05:39.857803 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/util/0.log" Oct 06 10:05:39 crc kubenswrapper[4610]: I1006 10:05:39.887763 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/registry-server/0.log" Oct 06 10:05:39 crc kubenswrapper[4610]: I1006 10:05:39.947422 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/pull/0.log" Oct 06 10:05:39 crc 
kubenswrapper[4610]: I1006 10:05:39.947897 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/pull/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.219499 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/pull/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.221452 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/util/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.296589 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/extract/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.513823 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q9tl7_2f273f28-c469-4f2f-a0de-bad2dd1345cb/marketplace-operator/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.558785 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-utilities/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.766775 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-utilities/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.806059 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-content/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.837896 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-content/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.977842 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-content/0.log" Oct 06 10:05:40 crc kubenswrapper[4610]: I1006 10:05:40.982025 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-utilities/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.185243 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/registry-server/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.255000 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-utilities/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.404134 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-content/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.460364 4610 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-content/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.464432 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-utilities/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.639875 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-content/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.655343 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-utilities/0.log" Oct 06 10:05:41 crc kubenswrapper[4610]: I1006 10:05:41.811624 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/extract-utilities/0.log" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.139987 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/registry-server/0.log" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.571380 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/extract-utilities/0.log" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.592776 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/extract-content/0.log" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.621182 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/extract-content/0.log" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.758083 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.908955 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:42 crc kubenswrapper[4610]: I1006 10:05:42.978646 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/extract-content/0.log" Oct 06 10:05:43 crc kubenswrapper[4610]: I1006 10:05:43.001312 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xv59x"] Oct 06 10:05:43 crc kubenswrapper[4610]: I1006 10:05:43.022673 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/extract-utilities/0.log" Oct 06 10:05:43 crc kubenswrapper[4610]: I1006 10:05:43.064525 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xv59x_e28d7cd8-0b86-4803-8f42-b15c0f8713eb/registry-server/0.log" Oct 06 10:05:44 crc kubenswrapper[4610]: I1006 10:05:44.530514 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xv59x" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="registry-server" 
containerID="cri-o://57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94" gracePeriod=2 Oct 06 10:05:44 crc kubenswrapper[4610]: I1006 10:05:44.960414 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:44 crc kubenswrapper[4610]: I1006 10:05:44.998131 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-utilities\") pod \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " Oct 06 10:05:44 crc kubenswrapper[4610]: I1006 10:05:44.998169 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsjtt\" (UniqueName: \"kubernetes.io/projected/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-kube-api-access-vsjtt\") pod \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " Oct 06 10:05:44 crc kubenswrapper[4610]: I1006 10:05:44.998274 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-catalog-content\") pod \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\" (UID: \"e28d7cd8-0b86-4803-8f42-b15c0f8713eb\") " Oct 06 10:05:44 crc kubenswrapper[4610]: I1006 10:05:44.998853 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-utilities" (OuterVolumeSpecName: "utilities") pod "e28d7cd8-0b86-4803-8f42-b15c0f8713eb" (UID: "e28d7cd8-0b86-4803-8f42-b15c0f8713eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.008233 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-kube-api-access-vsjtt" (OuterVolumeSpecName: "kube-api-access-vsjtt") pod "e28d7cd8-0b86-4803-8f42-b15c0f8713eb" (UID: "e28d7cd8-0b86-4803-8f42-b15c0f8713eb"). InnerVolumeSpecName "kube-api-access-vsjtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.070773 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e28d7cd8-0b86-4803-8f42-b15c0f8713eb" (UID: "e28d7cd8-0b86-4803-8f42-b15c0f8713eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.103416 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.103449 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsjtt\" (UniqueName: \"kubernetes.io/projected/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-kube-api-access-vsjtt\") on node \"crc\" DevicePath \"\"" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.103459 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28d7cd8-0b86-4803-8f42-b15c0f8713eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.540429 4610 generic.go:334] "Generic (PLEG): container finished" podID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerID="57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94" exitCode=0 Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.540473 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerDied","Data":"57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94"} Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.540501 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv59x" event={"ID":"e28d7cd8-0b86-4803-8f42-b15c0f8713eb","Type":"ContainerDied","Data":"116120d77d49d5ba36809c4be72513ac41fc300a8ded013574c6bff2a3a4c355"} Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.540517 4610 scope.go:117] "RemoveContainer" containerID="57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.540603 4610 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv59x" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.560531 4610 scope.go:117] "RemoveContainer" containerID="f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.583242 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xv59x"] Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.594338 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xv59x"] Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.594582 4610 scope.go:117] "RemoveContainer" containerID="0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.631236 4610 scope.go:117] "RemoveContainer" containerID="57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94" Oct 06 10:05:45 crc kubenswrapper[4610]: E1006 10:05:45.635209 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94\": container with ID starting with 57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94 not found: ID does not exist" containerID="57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.635260 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94"} err="failed to get container status \"57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94\": rpc error: code = NotFound desc = could not find container \"57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94\": container with ID starting with 57cec4ee987d256a55d579ae40f4d0470366e854ecf889ef1c0d24c108cd7c94 not found: ID does not exist" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.635293 4610 scope.go:117] "RemoveContainer" containerID="f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a" Oct 06 10:05:45 crc kubenswrapper[4610]: E1006 10:05:45.635720 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a\": container with ID starting with f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a not found: ID does not exist" containerID="f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.635754 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a"} err="failed to get container status \"f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a\": rpc error: code = NotFound desc = could not find container \"f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a\": container with ID starting with f6278444fa25b7423eb169506b83ed789c284142f60acb92ea709ae384e8062a not found: ID does not exist" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.635776 4610 scope.go:117] "RemoveContainer" containerID="0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28" Oct 06 10:05:45 crc kubenswrapper[4610]: E1006 10:05:45.636191 4610 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28\": container with ID starting with 0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28 not found: ID does not exist" containerID="0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28" Oct 06 10:05:45 crc kubenswrapper[4610]: I1006 10:05:45.636213 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28"} err="failed to get container status \"0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28\": rpc error: code = NotFound desc = could not find container \"0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28\": container with ID starting with 0fb610d950b2096612fdbcb70cb78c1fb072543aac04da4aff8eb1544d90be28 not found: ID does not exist" Oct 06 10:05:47 crc kubenswrapper[4610]: I1006 10:05:47.082500 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" path="/var/lib/kubelet/pods/e28d7cd8-0b86-4803-8f42-b15c0f8713eb/volumes" Oct 06 10:05:51 crc kubenswrapper[4610]: I1006 10:05:51.069711 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:05:51 crc kubenswrapper[4610]: E1006 10:05:51.071248 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:06:06 crc kubenswrapper[4610]: I1006 10:06:06.071403 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:06:06 crc kubenswrapper[4610]: E1006 10:06:06.071955 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:06:19 crc kubenswrapper[4610]: I1006 10:06:19.085917 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:06:19 crc kubenswrapper[4610]: E1006 10:06:19.090791 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:06:32 crc kubenswrapper[4610]: I1006 10:06:32.071736 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:06:32 crc kubenswrapper[4610]: E1006 10:06:32.072628 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:06:46 crc kubenswrapper[4610]: I1006 10:06:46.070951 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:06:46 crc kubenswrapper[4610]: E1006 10:06:46.071870 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:06:57 crc kubenswrapper[4610]: I1006 10:06:57.071509 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:06:57 crc kubenswrapper[4610]: E1006 10:06:57.073377 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:07:09 crc kubenswrapper[4610]: I1006 10:07:09.089189 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:07:09 crc kubenswrapper[4610]: E1006 10:07:09.091128 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:07:20 crc kubenswrapper[4610]: I1006 10:07:20.071008 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:07:20 crc kubenswrapper[4610]: E1006 10:07:20.072823 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:07:35 crc kubenswrapper[4610]: I1006 10:07:35.071992 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:07:35 crc kubenswrapper[4610]: E1006 10:07:35.072799 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:07:49 crc kubenswrapper[4610]: I1006 10:07:49.086322 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:07:49 crc kubenswrapper[4610]: E1006 10:07:49.087449 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:08:04 crc kubenswrapper[4610]: I1006 10:08:04.071622 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:08:04 crc kubenswrapper[4610]: E1006 10:08:04.072528 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:08:15 crc kubenswrapper[4610]: I1006 10:08:15.070970 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:08:15 crc kubenswrapper[4610]: E1006 10:08:15.071899 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:08:16 crc kubenswrapper[4610]: I1006 10:08:16.297781 4610 scope.go:117] "RemoveContainer" containerID="e52e77a19dd9badac8c549d66f315b2f35d79c6725d0cddb882b748283693a18" Oct 06 10:08:17 crc kubenswrapper[4610]: I1006 10:08:17.271516 4610 generic.go:334] "Generic (PLEG): container finished" podID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerID="cf2095d28d96a94ae582c01e05818d4dfb4f119dbc2940f6442114b364ab6a4c" exitCode=0 Oct 06 10:08:17 crc kubenswrapper[4610]: I1006 10:08:17.271795 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" event={"ID":"b36935b4-d00d-4548-a6f6-4b838fa76ec1","Type":"ContainerDied","Data":"cf2095d28d96a94ae582c01e05818d4dfb4f119dbc2940f6442114b364ab6a4c"} Oct 06 10:08:17 crc kubenswrapper[4610]: I1006 10:08:17.272550 4610 scope.go:117] "RemoveContainer" containerID="cf2095d28d96a94ae582c01e05818d4dfb4f119dbc2940f6442114b364ab6a4c" Oct 06 10:08:17 crc kubenswrapper[4610]: I1006 10:08:17.929177 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wcrz5_must-gather-6jtdk_b36935b4-d00d-4548-a6f6-4b838fa76ec1/gather/0.log" Oct 06 10:08:26 crc kubenswrapper[4610]: I1006 10:08:26.070958 4610 scope.go:117] "RemoveContainer" 
containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:08:26 crc kubenswrapper[4610]: E1006 10:08:26.071745 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.141419 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wcrz5/must-gather-6jtdk"] Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.141821 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="copy" containerID="cri-o://5bb09b43df46ceb743a66b8ac6e677dc18b88543b00e41f3e2ddaa1d2a6fce1e" gracePeriod=2 Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.151670 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wcrz5/must-gather-6jtdk"] Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.414771 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wcrz5_must-gather-6jtdk_b36935b4-d00d-4548-a6f6-4b838fa76ec1/copy/0.log" Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.415346 4610 generic.go:334] "Generic (PLEG): container finished" podID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerID="5bb09b43df46ceb743a66b8ac6e677dc18b88543b00e41f3e2ddaa1d2a6fce1e" exitCode=143 Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.864621 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wcrz5_must-gather-6jtdk_b36935b4-d00d-4548-a6f6-4b838fa76ec1/copy/0.log" Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.864975 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.935540 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b36935b4-d00d-4548-a6f6-4b838fa76ec1-must-gather-output\") pod \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.935588 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blhdf\" (UniqueName: \"kubernetes.io/projected/b36935b4-d00d-4548-a6f6-4b838fa76ec1-kube-api-access-blhdf\") pod \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\" (UID: \"b36935b4-d00d-4548-a6f6-4b838fa76ec1\") " Oct 06 10:08:27 crc kubenswrapper[4610]: I1006 10:08:27.951023 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36935b4-d00d-4548-a6f6-4b838fa76ec1-kube-api-access-blhdf" (OuterVolumeSpecName: "kube-api-access-blhdf") pod "b36935b4-d00d-4548-a6f6-4b838fa76ec1" (UID: "b36935b4-d00d-4548-a6f6-4b838fa76ec1"). InnerVolumeSpecName "kube-api-access-blhdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.038154 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blhdf\" (UniqueName: \"kubernetes.io/projected/b36935b4-d00d-4548-a6f6-4b838fa76ec1-kube-api-access-blhdf\") on node \"crc\" DevicePath \"\"" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.132644 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36935b4-d00d-4548-a6f6-4b838fa76ec1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b36935b4-d00d-4548-a6f6-4b838fa76ec1" (UID: "b36935b4-d00d-4548-a6f6-4b838fa76ec1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.140640 4610 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b36935b4-d00d-4548-a6f6-4b838fa76ec1-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.422724 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wcrz5_must-gather-6jtdk_b36935b4-d00d-4548-a6f6-4b838fa76ec1/copy/0.log" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.423130 4610 scope.go:117] "RemoveContainer" containerID="5bb09b43df46ceb743a66b8ac6e677dc18b88543b00e41f3e2ddaa1d2a6fce1e" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.423167 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcrz5/must-gather-6jtdk" Oct 06 10:08:28 crc kubenswrapper[4610]: I1006 10:08:28.442946 4610 scope.go:117] "RemoveContainer" containerID="cf2095d28d96a94ae582c01e05818d4dfb4f119dbc2940f6442114b364ab6a4c" Oct 06 10:08:29 crc kubenswrapper[4610]: I1006 10:08:29.085304 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" path="/var/lib/kubelet/pods/b36935b4-d00d-4548-a6f6-4b838fa76ec1/volumes" Oct 06 10:08:38 crc kubenswrapper[4610]: I1006 10:08:38.070771 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:08:38 crc kubenswrapper[4610]: E1006 10:08:38.071773 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:08:52 crc kubenswrapper[4610]: I1006 10:08:52.070567 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:08:52 crc kubenswrapper[4610]: E1006 10:08:52.071547 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:09:05 crc kubenswrapper[4610]: I1006 10:09:05.070692 4610 scope.go:117] "RemoveContainer" 
containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:09:05 crc kubenswrapper[4610]: E1006 10:09:05.071683 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.626446 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgpxl/must-gather-nmc77"] Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631378 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="registry-server" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631410 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="registry-server" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631431 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="extract-content" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631439 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="extract-content" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631456 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="extract-utilities" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631464 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="extract-utilities" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631477 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="gather" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631483 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="gather" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631497 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="extract-content" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631503 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="extract-content" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631515 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="registry-server" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631521 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="registry-server" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631534 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="copy" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631540 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="copy" Oct 06 10:09:08 crc kubenswrapper[4610]: E1006 10:09:08.631552 4610 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="extract-utilities" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631558 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="extract-utilities" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631799 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="gather" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631809 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28d7cd8-0b86-4803-8f42-b15c0f8713eb" containerName="registry-server" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631830 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="886bce59-e8be-4837-85b5-b894f67de9ba" containerName="registry-server" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.631843 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36935b4-d00d-4548-a6f6-4b838fa76ec1" containerName="copy" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.632806 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.642283 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jgpxl"/"openshift-service-ca.crt" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.642484 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jgpxl"/"kube-root-ca.crt" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.666412 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jgpxl/must-gather-nmc77"] Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.686462 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9548ab5b-bf5e-46b1-a8be-f1020de63b13-must-gather-output\") pod \"must-gather-nmc77\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.686617 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ffg\" (UniqueName: \"kubernetes.io/projected/9548ab5b-bf5e-46b1-a8be-f1020de63b13-kube-api-access-c6ffg\") pod \"must-gather-nmc77\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.788608 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ffg\" (UniqueName: \"kubernetes.io/projected/9548ab5b-bf5e-46b1-a8be-f1020de63b13-kube-api-access-c6ffg\") pod \"must-gather-nmc77\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.788696 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9548ab5b-bf5e-46b1-a8be-f1020de63b13-must-gather-output\") pod \"must-gather-nmc77\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.789155 4610 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9548ab5b-bf5e-46b1-a8be-f1020de63b13-must-gather-output\") pod \"must-gather-nmc77\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.809882 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ffg\" (UniqueName: \"kubernetes.io/projected/9548ab5b-bf5e-46b1-a8be-f1020de63b13-kube-api-access-c6ffg\") pod \"must-gather-nmc77\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:08 crc kubenswrapper[4610]: I1006 10:09:08.958238 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:09:09 crc kubenswrapper[4610]: I1006 10:09:09.462232 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jgpxl/must-gather-nmc77"] Oct 06 10:09:09 crc kubenswrapper[4610]: I1006 10:09:09.857789 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/must-gather-nmc77" event={"ID":"9548ab5b-bf5e-46b1-a8be-f1020de63b13","Type":"ContainerStarted","Data":"c08c46ae012c644ac9b007567b0e2b6e0f1c23a2043c50c7a4c327757dc6187d"} Oct 06 10:09:09 crc kubenswrapper[4610]: I1006 10:09:09.858049 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/must-gather-nmc77" event={"ID":"9548ab5b-bf5e-46b1-a8be-f1020de63b13","Type":"ContainerStarted","Data":"8cd5e5e5bbfed4ecb396f88ca3d81f27beb56c7fd13e80cccf5f9770fedb8acb"} Oct 06 10:09:10 crc kubenswrapper[4610]: I1006 10:09:10.868643 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/must-gather-nmc77" event={"ID":"9548ab5b-bf5e-46b1-a8be-f1020de63b13","Type":"ContainerStarted","Data":"6b5faa4c49e9a9b2e20ead7754e3af8cb2067081f5e0deadddacf42b78a49207"} Oct 06 10:09:10 crc kubenswrapper[4610]: I1006 10:09:10.890218 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jgpxl/must-gather-nmc77" podStartSLOduration=2.890196227 podStartE2EDuration="2.890196227s" podCreationTimestamp="2025-10-06 10:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 10:09:10.881481869 +0000 UTC m=+5282.596535297" watchObservedRunningTime="2025-10-06 10:09:10.890196227 +0000 UTC m=+5282.605249615" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.148355 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-gplq8"] Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.150249 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.152764 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jgpxl"/"default-dockercfg-rkx9m" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.304991 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7ffj\" (UniqueName: \"kubernetes.io/projected/4eef8139-4227-4235-93b5-490212e67b15-kube-api-access-h7ffj\") pod \"crc-debug-gplq8\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") " pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.305145 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eef8139-4227-4235-93b5-490212e67b15-host\") pod \"crc-debug-gplq8\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") " pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.407541 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7ffj\" (UniqueName: \"kubernetes.io/projected/4eef8139-4227-4235-93b5-490212e67b15-kube-api-access-h7ffj\") pod \"crc-debug-gplq8\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") " pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.407637 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eef8139-4227-4235-93b5-490212e67b15-host\") pod \"crc-debug-gplq8\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") " pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.407757 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eef8139-4227-4235-93b5-490212e67b15-host\") pod \"crc-debug-gplq8\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") " pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.428798 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7ffj\" (UniqueName: \"kubernetes.io/projected/4eef8139-4227-4235-93b5-490212e67b15-kube-api-access-h7ffj\") pod \"crc-debug-gplq8\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") " pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.473107 4610 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-gplq8" Oct 06 10:09:13 crc kubenswrapper[4610]: W1006 10:09:13.528217 4610 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eef8139_4227_4235_93b5_490212e67b15.slice/crio-5b9a22544b3a2c0b19cb7e6e2d995c711ec0eecea1efbdf3cd1884a08c708a9a WatchSource:0}: Error finding container 5b9a22544b3a2c0b19cb7e6e2d995c711ec0eecea1efbdf3cd1884a08c708a9a: Status 404 returned error can't find the container with id 5b9a22544b3a2c0b19cb7e6e2d995c711ec0eecea1efbdf3cd1884a08c708a9a Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.895114 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-gplq8" event={"ID":"4eef8139-4227-4235-93b5-490212e67b15","Type":"ContainerStarted","Data":"f2b342c34b7c2ec3f1376251b40cfd4d470e3dd86fb339d83e16110c781bba0e"} Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.895433 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-gplq8" event={"ID":"4eef8139-4227-4235-93b5-490212e67b15","Type":"ContainerStarted","Data":"5b9a22544b3a2c0b19cb7e6e2d995c711ec0eecea1efbdf3cd1884a08c708a9a"} Oct 06 10:09:13 crc kubenswrapper[4610]: I1006 10:09:13.907715 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jgpxl/crc-debug-gplq8" podStartSLOduration=0.907692638 podStartE2EDuration="907.692638ms" podCreationTimestamp="2025-10-06 10:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 10:09:13.906370223 +0000 UTC m=+5285.621423611" watchObservedRunningTime="2025-10-06 10:09:13.907692638 +0000 UTC m=+5285.622746036" Oct 06 10:09:16 crc kubenswrapper[4610]: I1006 10:09:16.070864 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:09:16 crc kubenswrapper[4610]: E1006 10:09:16.071548 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:09:27 crc kubenswrapper[4610]: I1006 10:09:27.070827 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:09:27 crc kubenswrapper[4610]: E1006 10:09:27.071739 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:09:38 crc kubenswrapper[4610]: I1006 10:09:38.069976 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:09:38 crc kubenswrapper[4610]: E1006 10:09:38.074757 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" Oct 06 10:09:50 crc kubenswrapper[4610]: I1006 10:09:50.071262 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af" Oct 06 10:09:51 crc kubenswrapper[4610]: I1006 10:09:51.272106 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"f4e47ed9c33d6aa6bfd0c2be196b34f713fb291d41b3eb5f6debbcc208ba308a"} Oct 06 10:10:16 crc kubenswrapper[4610]: I1006 10:10:16.396518 4610 scope.go:117] "RemoveContainer" containerID="0c383ae48c7413245fd1375dcc0e3080f2b36d823e000a0040c0bbdf567f1ba6" Oct 06 10:10:31 crc kubenswrapper[4610]: I1006 10:10:31.924392 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6849467754-2xpgn_8416ea1e-d79e-4dc2-8902-d59c8c4bbc60/barbican-api/0.log" Oct 06 10:10:31 crc kubenswrapper[4610]: I1006 10:10:31.966132 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6849467754-2xpgn_8416ea1e-d79e-4dc2-8902-d59c8c4bbc60/barbican-api-log/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.144544 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79f9c7c798-mtvls_9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d/barbican-keystone-listener/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.249828 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79f9c7c798-mtvls_9d82a949-dfd1-4b3e-8bc3-41c251fa4f3d/barbican-keystone-listener-log/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.381512 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776cb49575-gr6gq_8e13d8e7-f118-4d18-ab69-162dadc7f649/barbican-worker/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.481659 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776cb49575-gr6gq_8e13d8e7-f118-4d18-ab69-162dadc7f649/barbican-worker-log/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.682775 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-67nvv_290e102b-3121-4f44-b861-2b2e2e297f7b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.875624 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/ceilometer-central-agent/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.947805 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/proxy-httpd/0.log" Oct 06 10:10:32 crc kubenswrapper[4610]: I1006 10:10:32.957142 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/ceilometer-notification-agent/0.log" Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.079521 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b6364c8-c83a-400c-85fb-52df075a07d4/sg-core/0.log" 
Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.217872 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_279bb64b-8fba-4afc-9ded-6bd2375521ba/cinder-api/0.log"
Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.302929 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_279bb64b-8fba-4afc-9ded-6bd2375521ba/cinder-api-log/0.log"
Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.506944 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e29afc72-dbf0-453c-b96b-42d0399d6286/cinder-scheduler/0.log"
Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.603157 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e29afc72-dbf0-453c-b96b-42d0399d6286/probe/0.log"
Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.762817 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ngtb6_489d3203-794d-455a-b5b8-97f933b8db19/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:33 crc kubenswrapper[4610]: I1006 10:10:33.940206 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-687wd_b87a175d-5d06-4825-981c-ed2cf97fb652/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.098024 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jxqc6_e529d0f1-3da5-4178-8720-5769624f4490/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.245847 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667c9c995c-dfhd5_2228f60d-4cb6-43a2-9259-848d7353ad4b/init/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.476150 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667c9c995c-dfhd5_2228f60d-4cb6-43a2-9259-848d7353ad4b/init/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.639304 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667c9c995c-dfhd5_2228f60d-4cb6-43a2-9259-848d7353ad4b/dnsmasq-dns/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.708549 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gcsmz_460008b6-6b5b-43a1-b806-01340e52e472/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.883819 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_225e171a-3dd8-4d73-af22-fa01ef4a7359/glance-httpd/0.log"
Oct 06 10:10:34 crc kubenswrapper[4610]: I1006 10:10:34.944349 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_225e171a-3dd8-4d73-af22-fa01ef4a7359/glance-log/0.log"
Oct 06 10:10:35 crc kubenswrapper[4610]: I1006 10:10:35.101864 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ba3a861e-9618-4947-9e23-c285ec4d43a6/glance-log/0.log"
Oct 06 10:10:35 crc kubenswrapper[4610]: I1006 10:10:35.145291 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ba3a861e-9618-4947-9e23-c285ec4d43a6/glance-httpd/0.log"
Oct 06 10:10:35 crc kubenswrapper[4610]: I1006 10:10:35.473870 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-868f4bc56b-f2np4_0843392c-2df1-4619-9745-21ca7d06a589/horizon/0.log"
Oct 06 10:10:35 crc kubenswrapper[4610]: I1006 10:10:35.595099 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ps99p_7e49b85b-bbed-4c13-b513-3d61369aa3c0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:35 crc kubenswrapper[4610]: I1006 10:10:35.815121 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-868f4bc56b-f2np4_0843392c-2df1-4619-9745-21ca7d06a589/horizon-log/0.log"
Oct 06 10:10:35 crc kubenswrapper[4610]: I1006 10:10:35.840490 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rcwcf_b1e06674-8934-4170-9b16-5bf7292977ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:36 crc kubenswrapper[4610]: I1006 10:10:36.118077 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329081-rlm67_06b1f441-39d4-4c07-8696-045031364dd2/keystone-cron/0.log"
Oct 06 10:10:36 crc kubenswrapper[4610]: I1006 10:10:36.416538 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-84759bdbdc-r6gkv_27ee29ca-3774-42c0-a3d0-164644f89e7d/keystone-api/0.log"
Oct 06 10:10:36 crc kubenswrapper[4610]: I1006 10:10:36.479199 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cdd44fea-d46e-45e1-be47-89cc8a1f63c7/kube-state-metrics/0.log"
Oct 06 10:10:36 crc kubenswrapper[4610]: I1006 10:10:36.621949 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9gsth_7198dacf-4e83-415a-a302-d543a7c2fea9/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:37 crc kubenswrapper[4610]: I1006 10:10:37.294732 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-hzmdn_4288043c-e9b4-4c1c-8234-3f44be6fbc2f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:37 crc kubenswrapper[4610]: I1006 10:10:37.517371 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69d948d6bf-n5vv6_f61e2bff-9119-4208-a7a0-c8da777e049b/neutron-httpd/0.log"
Oct 06 10:10:37 crc kubenswrapper[4610]: I1006 10:10:37.533692 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69d948d6bf-n5vv6_f61e2bff-9119-4208-a7a0-c8da777e049b/neutron-api/0.log"
Oct 06 10:10:38 crc kubenswrapper[4610]: I1006 10:10:38.402696 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bb2657e1-8319-4a7d-be1f-a48d66bd5ba8/nova-cell0-conductor-conductor/0.log"
Oct 06 10:10:38 crc kubenswrapper[4610]: I1006 10:10:38.976656 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_49c7e6ea-9091-4658-bc36-3c82f6f25682/nova-api-log/0.log"
Oct 06 10:10:39 crc kubenswrapper[4610]: I1006 10:10:39.067169 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f7dd20d6-3e39-4ba9-8f1c-d336ed4ef992/nova-cell1-conductor-conductor/0.log"
Oct 06 10:10:39 crc kubenswrapper[4610]: I1006 10:10:39.472219 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_760824cd-931b-4588-85d0-8b0548fc8c38/nova-cell1-novncproxy-novncproxy/0.log"
Oct 06 10:10:39 crc kubenswrapper[4610]: I1006 10:10:39.605401 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_49c7e6ea-9091-4658-bc36-3c82f6f25682/nova-api-api/0.log"
Oct 06 10:10:39 crc kubenswrapper[4610]: I1006 10:10:39.792314 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jxfdh_3ba105b9-3b48-4236-a86d-6fcded83393a/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:40 crc kubenswrapper[4610]: I1006 10:10:39.999969 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_65ccdb5a-a886-4df8-9f4c-9bccb814641a/nova-metadata-log/0.log"
Oct 06 10:10:40 crc kubenswrapper[4610]: I1006 10:10:40.655297 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6faa6b22-87fb-46cf-93cf-0848f9f7ce06/mysql-bootstrap/0.log"
Oct 06 10:10:40 crc kubenswrapper[4610]: I1006 10:10:40.842163 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_272d54f3-da6d-4d44-b723-956ac2cc65a4/nova-scheduler-scheduler/0.log"
Oct 06 10:10:40 crc kubenswrapper[4610]: I1006 10:10:40.883976 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6faa6b22-87fb-46cf-93cf-0848f9f7ce06/mysql-bootstrap/0.log"
Oct 06 10:10:41 crc kubenswrapper[4610]: I1006 10:10:41.281777 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6faa6b22-87fb-46cf-93cf-0848f9f7ce06/galera/0.log"
Oct 06 10:10:41 crc kubenswrapper[4610]: I1006 10:10:41.454780 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_21951fd5-4bf8-4851-b82f-874f75967f7c/mysql-bootstrap/0.log"
Oct 06 10:10:41 crc kubenswrapper[4610]: I1006 10:10:41.747818 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_21951fd5-4bf8-4851-b82f-874f75967f7c/mysql-bootstrap/0.log"
Oct 06 10:10:41 crc kubenswrapper[4610]: I1006 10:10:41.764874 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_21951fd5-4bf8-4851-b82f-874f75967f7c/galera/0.log"
Oct 06 10:10:41 crc kubenswrapper[4610]: I1006 10:10:41.971199 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5b7473c8-fdfd-426a-99da-57bc4175e303/openstackclient/0.log"
Oct 06 10:10:42 crc kubenswrapper[4610]: I1006 10:10:42.206471 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_65ccdb5a-a886-4df8-9f4c-9bccb814641a/nova-metadata-metadata/0.log"
Oct 06 10:10:42 crc kubenswrapper[4610]: I1006 10:10:42.265467 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6hjff_1e77ce11-f629-48ab-820e-e67fbfc3ba57/ovn-controller/0.log"
Oct 06 10:10:42 crc kubenswrapper[4610]: I1006 10:10:42.407557 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-62bzc_cfae3507-92ed-4d33-85d2-b5a0c3beed93/openstack-network-exporter/0.log"
Oct 06 10:10:42 crc kubenswrapper[4610]: I1006 10:10:42.685084 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovsdb-server-init/0.log"
Oct 06 10:10:42 crc kubenswrapper[4610]: I1006 10:10:42.963851 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovsdb-server-init/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.054854 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovs-vswitchd/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.066087 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pfhq5_478db756-12b3-40f7-b49c-49a548bdf337/ovsdb-server/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.311656 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gbnnk_e9eecc46-8a50-486b-ae37-2ba0f62b5216/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.580217 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f726409a-ab18-426c-84e7-2d8ae473a3d4/openstack-network-exporter/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.666805 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f726409a-ab18-426c-84e7-2d8ae473a3d4/ovn-northd/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.787884 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea778a76-1f2e-4289-8b2f-7ccc1975eb3d/openstack-network-exporter/0.log"
Oct 06 10:10:43 crc kubenswrapper[4610]: I1006 10:10:43.854775 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea778a76-1f2e-4289-8b2f-7ccc1975eb3d/ovsdbserver-nb/0.log"
Oct 06 10:10:44 crc kubenswrapper[4610]: I1006 10:10:44.040333 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f51717ef-7ac5-45b1-ae7c-beddba660645/openstack-network-exporter/0.log"
Oct 06 10:10:44 crc kubenswrapper[4610]: I1006 10:10:44.145944 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f51717ef-7ac5-45b1-ae7c-beddba660645/ovsdbserver-sb/0.log"
Oct 06 10:10:44 crc kubenswrapper[4610]: I1006 10:10:44.566154 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7659d8b-729ds_a417a1f5-8eba-4d85-9b6e-730463fe2734/placement-api/0.log"
Oct 06 10:10:44 crc kubenswrapper[4610]: I1006 10:10:44.635319 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-b7659d8b-729ds_a417a1f5-8eba-4d85-9b6e-730463fe2734/placement-log/0.log"
Oct 06 10:10:44 crc kubenswrapper[4610]: I1006 10:10:44.855393 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_099c0f32-ad2c-4b69-a308-f46f3dbab2be/setup-container/0.log"
Oct 06 10:10:45 crc kubenswrapper[4610]: I1006 10:10:45.069947 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_099c0f32-ad2c-4b69-a308-f46f3dbab2be/rabbitmq/0.log"
Oct 06 10:10:45 crc kubenswrapper[4610]: I1006 10:10:45.120268 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_099c0f32-ad2c-4b69-a308-f46f3dbab2be/setup-container/0.log"
Oct 06 10:10:45 crc kubenswrapper[4610]: I1006 10:10:45.359726 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_060bb971-d347-44c3-b9ce-6c06c13bcb51/setup-container/0.log"
Oct 06 10:10:45 crc kubenswrapper[4610]: I1006 10:10:45.580879 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_060bb971-d347-44c3-b9ce-6c06c13bcb51/setup-container/0.log"
Oct 06 10:10:45 crc kubenswrapper[4610]: I1006 10:10:45.740368 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_060bb971-d347-44c3-b9ce-6c06c13bcb51/rabbitmq/0.log"
Oct 06 10:10:45 crc kubenswrapper[4610]: I1006 10:10:45.933024 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fxgvw_f1784103-7612-4a23-9135-eb81df0fe2ce/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:46 crc kubenswrapper[4610]: I1006 10:10:46.084680 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2dstt_4dd4792e-84d5-41ff-bc84-b3d0bde5377a/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:46 crc kubenswrapper[4610]: I1006 10:10:46.356669 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-pqzx9_f70ce47b-f642-41e9-8649-7dc466c07c27/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:46 crc kubenswrapper[4610]: I1006 10:10:46.545900 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fgvht_5c30c0cb-9027-4935-bca1-0debc398c091/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:46 crc kubenswrapper[4610]: I1006 10:10:46.754307 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4l44_ee7bec6c-22cc-448e-8939-798d80db2045/ssh-known-hosts-edpm-deployment/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.067702 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c65b98c55-xjdpw_2d4cceaf-e744-49da-a634-84401f61d862/proxy-httpd/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.114909 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c65b98c55-xjdpw_2d4cceaf-e744-49da-a634-84401f61d862/proxy-server/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.321206 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pv9bk_83179f37-2a3e-4b31-8d5e-fcdaf56961a5/swift-ring-rebalance/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.533951 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-auditor/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.627444 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-reaper/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.711295 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-replicator/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.807641 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/account-server/0.log"
Oct 06 10:10:47 crc kubenswrapper[4610]: I1006 10:10:47.942966 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-auditor/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.053029 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-replicator/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.063996 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-server/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.225094 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/container-updater/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.338140 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-auditor/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.350876 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-expirer/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.475380 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-replicator/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.602934 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-server/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.676146 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/object-updater/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.701023 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/rsync/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.856251 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_05c553c8-ced7-4296-b8c5-12b91a953b1d/swift-recon-cron/0.log"
Oct 06 10:10:48 crc kubenswrapper[4610]: I1006 10:10:48.967228 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jq9qd_a11ef1e8-ba4f-4b82-adad-cbe054665d4c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:10:49 crc kubenswrapper[4610]: I1006 10:10:49.266264 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6effef24-402a-46e6-a15a-02815ef810ae/tempest-tests-tempest-tests-runner/0.log"
Oct 06 10:10:49 crc kubenswrapper[4610]: I1006 10:10:49.489325 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9a73cfa7-ef0b-4dda-9ca4-de80de751a61/test-operator-logs-container/0.log"
Oct 06 10:10:49 crc kubenswrapper[4610]: I1006 10:10:49.622122 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zqmd5_d37ed6ae-3ad3-4604-9149-4e2b8006375e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 10:11:00 crc kubenswrapper[4610]: I1006 10:11:00.036635 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ef8e8806-0063-480d-933b-5a6c760d503e/memcached/0.log"
Oct 06 10:11:19 crc kubenswrapper[4610]: I1006 10:11:19.020762 4610 generic.go:334] "Generic (PLEG): container finished" podID="4eef8139-4227-4235-93b5-490212e67b15" containerID="f2b342c34b7c2ec3f1376251b40cfd4d470e3dd86fb339d83e16110c781bba0e" exitCode=0
Oct 06 10:11:19 crc kubenswrapper[4610]: I1006 10:11:19.020848 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-gplq8" event={"ID":"4eef8139-4227-4235-93b5-490212e67b15","Type":"ContainerDied","Data":"f2b342c34b7c2ec3f1376251b40cfd4d470e3dd86fb339d83e16110c781bba0e"}
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.134085 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-gplq8"
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.176166 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-gplq8"]
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.183203 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-gplq8"]
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.236594 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7ffj\" (UniqueName: \"kubernetes.io/projected/4eef8139-4227-4235-93b5-490212e67b15-kube-api-access-h7ffj\") pod \"4eef8139-4227-4235-93b5-490212e67b15\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") "
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.236705 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eef8139-4227-4235-93b5-490212e67b15-host\") pod \"4eef8139-4227-4235-93b5-490212e67b15\" (UID: \"4eef8139-4227-4235-93b5-490212e67b15\") "
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.236819 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4eef8139-4227-4235-93b5-490212e67b15-host" (OuterVolumeSpecName: "host") pod "4eef8139-4227-4235-93b5-490212e67b15" (UID: "4eef8139-4227-4235-93b5-490212e67b15"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.237296 4610 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eef8139-4227-4235-93b5-490212e67b15-host\") on node \"crc\" DevicePath \"\""
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.241388 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eef8139-4227-4235-93b5-490212e67b15-kube-api-access-h7ffj" (OuterVolumeSpecName: "kube-api-access-h7ffj") pod "4eef8139-4227-4235-93b5-490212e67b15" (UID: "4eef8139-4227-4235-93b5-490212e67b15"). InnerVolumeSpecName "kube-api-access-h7ffj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:11:20 crc kubenswrapper[4610]: I1006 10:11:20.339351 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7ffj\" (UniqueName: \"kubernetes.io/projected/4eef8139-4227-4235-93b5-490212e67b15-kube-api-access-h7ffj\") on node \"crc\" DevicePath \"\""
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.044147 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b9a22544b3a2c0b19cb7e6e2d995c711ec0eecea1efbdf3cd1884a08c708a9a"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.044279 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-gplq8"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.084964 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eef8139-4227-4235-93b5-490212e67b15" path="/var/lib/kubelet/pods/4eef8139-4227-4235-93b5-490212e67b15/volumes"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.440319 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-cf6xt"]
Oct 06 10:11:21 crc kubenswrapper[4610]: E1006 10:11:21.440981 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eef8139-4227-4235-93b5-490212e67b15" containerName="container-00"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.440996 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eef8139-4227-4235-93b5-490212e67b15" containerName="container-00"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.441720 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eef8139-4227-4235-93b5-490212e67b15" containerName="container-00"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.442496 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.444605 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jgpxl"/"default-dockercfg-rkx9m"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.563416 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wgtv\" (UniqueName: \"kubernetes.io/projected/d63e091c-cb89-453f-861d-f50a1fa29f4c-kube-api-access-9wgtv\") pod \"crc-debug-cf6xt\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") " pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.563455 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63e091c-cb89-453f-861d-f50a1fa29f4c-host\") pod \"crc-debug-cf6xt\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") " pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.665592 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wgtv\" (UniqueName: \"kubernetes.io/projected/d63e091c-cb89-453f-861d-f50a1fa29f4c-kube-api-access-9wgtv\") pod \"crc-debug-cf6xt\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") " pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.665660 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63e091c-cb89-453f-861d-f50a1fa29f4c-host\") pod \"crc-debug-cf6xt\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") " pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.665949 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63e091c-cb89-453f-861d-f50a1fa29f4c-host\") pod \"crc-debug-cf6xt\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") " pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.700925 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wgtv\" (UniqueName: \"kubernetes.io/projected/d63e091c-cb89-453f-861d-f50a1fa29f4c-kube-api-access-9wgtv\") pod \"crc-debug-cf6xt\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") " pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:21 crc kubenswrapper[4610]: I1006 10:11:21.769839 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:22 crc kubenswrapper[4610]: I1006 10:11:22.059750 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt" event={"ID":"d63e091c-cb89-453f-861d-f50a1fa29f4c","Type":"ContainerStarted","Data":"c8fbd6e4fb3c70c75f12a58f23555712a7d062dab7cd003aa7dfc0a2dbc207d9"}
Oct 06 10:11:23 crc kubenswrapper[4610]: I1006 10:11:23.071762 4610 generic.go:334] "Generic (PLEG): container finished" podID="d63e091c-cb89-453f-861d-f50a1fa29f4c" containerID="804ed9643ca948743c31222c8a8e22c1d6b026b12abc972c070f67d66c51578d" exitCode=0
Oct 06 10:11:23 crc kubenswrapper[4610]: I1006 10:11:23.085213 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt" event={"ID":"d63e091c-cb89-453f-861d-f50a1fa29f4c","Type":"ContainerDied","Data":"804ed9643ca948743c31222c8a8e22c1d6b026b12abc972c070f67d66c51578d"}
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.183496 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.311450 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wgtv\" (UniqueName: \"kubernetes.io/projected/d63e091c-cb89-453f-861d-f50a1fa29f4c-kube-api-access-9wgtv\") pod \"d63e091c-cb89-453f-861d-f50a1fa29f4c\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") "
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.311561 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63e091c-cb89-453f-861d-f50a1fa29f4c-host\") pod \"d63e091c-cb89-453f-861d-f50a1fa29f4c\" (UID: \"d63e091c-cb89-453f-861d-f50a1fa29f4c\") "
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.312143 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d63e091c-cb89-453f-861d-f50a1fa29f4c-host" (OuterVolumeSpecName: "host") pod "d63e091c-cb89-453f-861d-f50a1fa29f4c" (UID: "d63e091c-cb89-453f-861d-f50a1fa29f4c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.349220 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63e091c-cb89-453f-861d-f50a1fa29f4c-kube-api-access-9wgtv" (OuterVolumeSpecName: "kube-api-access-9wgtv") pod "d63e091c-cb89-453f-861d-f50a1fa29f4c" (UID: "d63e091c-cb89-453f-861d-f50a1fa29f4c"). InnerVolumeSpecName "kube-api-access-9wgtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.413179 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wgtv\" (UniqueName: \"kubernetes.io/projected/d63e091c-cb89-453f-861d-f50a1fa29f4c-kube-api-access-9wgtv\") on node \"crc\" DevicePath \"\""
Oct 06 10:11:24 crc kubenswrapper[4610]: I1006 10:11:24.413211 4610 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d63e091c-cb89-453f-861d-f50a1fa29f4c-host\") on node \"crc\" DevicePath \"\""
Oct 06 10:11:25 crc kubenswrapper[4610]: I1006 10:11:25.086183 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt" event={"ID":"d63e091c-cb89-453f-861d-f50a1fa29f4c","Type":"ContainerDied","Data":"c8fbd6e4fb3c70c75f12a58f23555712a7d062dab7cd003aa7dfc0a2dbc207d9"}
Oct 06 10:11:25 crc kubenswrapper[4610]: I1006 10:11:25.086231 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8fbd6e4fb3c70c75f12a58f23555712a7d062dab7cd003aa7dfc0a2dbc207d9"
Oct 06 10:11:25 crc kubenswrapper[4610]: I1006 10:11:25.086306 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-cf6xt"
Oct 06 10:11:30 crc kubenswrapper[4610]: I1006 10:11:30.688228 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-cf6xt"]
Oct 06 10:11:30 crc kubenswrapper[4610]: I1006 10:11:30.695436 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-cf6xt"]
Oct 06 10:11:31 crc kubenswrapper[4610]: I1006 10:11:31.082344 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63e091c-cb89-453f-861d-f50a1fa29f4c" path="/var/lib/kubelet/pods/d63e091c-cb89-453f-861d-f50a1fa29f4c/volumes"
Oct 06 10:11:31 crc kubenswrapper[4610]: I1006 10:11:31.896356 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-d7d97"]
Oct 06 10:11:31 crc kubenswrapper[4610]: E1006 10:11:31.897533 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63e091c-cb89-453f-861d-f50a1fa29f4c" containerName="container-00"
Oct 06 10:11:31 crc kubenswrapper[4610]: I1006 10:11:31.897621 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63e091c-cb89-453f-861d-f50a1fa29f4c" containerName="container-00"
Oct 06 10:11:31 crc kubenswrapper[4610]: I1006 10:11:31.897890 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63e091c-cb89-453f-861d-f50a1fa29f4c" containerName="container-00"
Oct 06 10:11:31 crc kubenswrapper[4610]: I1006 10:11:31.898549 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:31 crc kubenswrapper[4610]: I1006 10:11:31.900693 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jgpxl"/"default-dockercfg-rkx9m"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.049194 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6640f602-9e8f-471a-ba56-c2e01ae7c629-host\") pod \"crc-debug-d7d97\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") " pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.049378 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fx7\" (UniqueName: \"kubernetes.io/projected/6640f602-9e8f-471a-ba56-c2e01ae7c629-kube-api-access-d5fx7\") pod \"crc-debug-d7d97\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") " pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.150834 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fx7\" (UniqueName: \"kubernetes.io/projected/6640f602-9e8f-471a-ba56-c2e01ae7c629-kube-api-access-d5fx7\") pod \"crc-debug-d7d97\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") " pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.150948 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6640f602-9e8f-471a-ba56-c2e01ae7c629-host\") pod \"crc-debug-d7d97\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") " pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.151117 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6640f602-9e8f-471a-ba56-c2e01ae7c629-host\") pod \"crc-debug-d7d97\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") " pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.174218 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fx7\" (UniqueName: \"kubernetes.io/projected/6640f602-9e8f-471a-ba56-c2e01ae7c629-kube-api-access-d5fx7\") pod \"crc-debug-d7d97\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") " pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:32 crc kubenswrapper[4610]: I1006 10:11:32.220310 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:33 crc kubenswrapper[4610]: I1006 10:11:33.161196 4610 generic.go:334] "Generic (PLEG): container finished" podID="6640f602-9e8f-471a-ba56-c2e01ae7c629" containerID="e05aeb9368356069b9bab91fb433bd0f5c90b7e7b4666c6b40852d4f0a3e28d5" exitCode=0
Oct 06 10:11:33 crc kubenswrapper[4610]: I1006 10:11:33.161251 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-d7d97" event={"ID":"6640f602-9e8f-471a-ba56-c2e01ae7c629","Type":"ContainerDied","Data":"e05aeb9368356069b9bab91fb433bd0f5c90b7e7b4666c6b40852d4f0a3e28d5"}
Oct 06 10:11:33 crc kubenswrapper[4610]: I1006 10:11:33.161299 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgpxl/crc-debug-d7d97" event={"ID":"6640f602-9e8f-471a-ba56-c2e01ae7c629","Type":"ContainerStarted","Data":"9a9f3b176370aee34dfb08152f4e62e5a05304bb599771a9e1c6e347e73081d7"}
Oct 06 10:11:33 crc kubenswrapper[4610]: I1006 10:11:33.201069 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-d7d97"]
Oct 06 10:11:33 crc kubenswrapper[4610]: I1006 10:11:33.208468 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgpxl/crc-debug-d7d97"]
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.335382 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.411289 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5fx7\" (UniqueName: \"kubernetes.io/projected/6640f602-9e8f-471a-ba56-c2e01ae7c629-kube-api-access-d5fx7\") pod \"6640f602-9e8f-471a-ba56-c2e01ae7c629\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") "
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.411563 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6640f602-9e8f-471a-ba56-c2e01ae7c629-host\") pod \"6640f602-9e8f-471a-ba56-c2e01ae7c629\" (UID: \"6640f602-9e8f-471a-ba56-c2e01ae7c629\") "
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.412136 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6640f602-9e8f-471a-ba56-c2e01ae7c629-host" (OuterVolumeSpecName: "host") pod "6640f602-9e8f-471a-ba56-c2e01ae7c629" (UID: "6640f602-9e8f-471a-ba56-c2e01ae7c629"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.427866 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6640f602-9e8f-471a-ba56-c2e01ae7c629-kube-api-access-d5fx7" (OuterVolumeSpecName: "kube-api-access-d5fx7") pod "6640f602-9e8f-471a-ba56-c2e01ae7c629" (UID: "6640f602-9e8f-471a-ba56-c2e01ae7c629"). InnerVolumeSpecName "kube-api-access-d5fx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.513541 4610 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6640f602-9e8f-471a-ba56-c2e01ae7c629-host\") on node \"crc\" DevicePath \"\""
Oct 06 10:11:34 crc kubenswrapper[4610]: I1006 10:11:34.513578 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5fx7\" (UniqueName: \"kubernetes.io/projected/6640f602-9e8f-471a-ba56-c2e01ae7c629-kube-api-access-d5fx7\") on node \"crc\" DevicePath \"\""
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.081458 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6640f602-9e8f-471a-ba56-c2e01ae7c629" path="/var/lib/kubelet/pods/6640f602-9e8f-471a-ba56-c2e01ae7c629/volumes"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.216460 4610 scope.go:117] "RemoveContainer" containerID="e05aeb9368356069b9bab91fb433bd0f5c90b7e7b4666c6b40852d4f0a3e28d5"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.216600 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/crc-debug-d7d97"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.232945 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-gcbb8_e8d37aed-cf46-47a0-a8ea-cfec57404966/manager/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.276812 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-gcbb8_e8d37aed-cf46-47a0-a8ea-cfec57404966/kube-rbac-proxy/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.403967 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/util/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.614575 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/pull/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.651805 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/util/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.664400 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/pull/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.818753 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/util/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.827658 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/pull/0.log"
Oct 06 10:11:35 crc kubenswrapper[4610]: I1006 10:11:35.936863 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cc5c9d72985612613b60d673fcd9a5546868deebd03ee102aed9b84bcfwdlnm_a7c9f18a-e16b-45ec-9d46-e879df2773ab/extract/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.021269 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-65ffb_590d1736-08ea-4b24-9462-51e4f9eb2169/manager/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.028429 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-65ffb_590d1736-08ea-4b24-9462-51e4f9eb2169/kube-rbac-proxy/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.157340 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-f6dh9_23749a1a-8450-4412-850b-1e044d290c69/kube-rbac-proxy/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.251172 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-f6dh9_23749a1a-8450-4412-850b-1e044d290c69/manager/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.365985 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-z448l_55d8474b-1188-4617-abe4-d5e45d9a85cb/kube-rbac-proxy/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.505195 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-z448l_55d8474b-1188-4617-abe4-d5e45d9a85cb/manager/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.577227 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-xfwfl_08b5e994-103b-40ba-aef6-4dd36e5c456e/kube-rbac-proxy/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.600376 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-xfwfl_08b5e994-103b-40ba-aef6-4dd36e5c456e/manager/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.740735 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-cqqbc_b63b18e4-4aee-4a86-a5cb-23393a3cfaa3/kube-rbac-proxy/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.785853 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-cqqbc_b63b18e4-4aee-4a86-a5cb-23393a3cfaa3/manager/0.log"
Oct 06 10:11:36 crc kubenswrapper[4610]: I1006 10:11:36.929195 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-4hldf_aa003bf3-ca26-468d-975a-5ceaa0361f14/kube-rbac-proxy/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.142274 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-4hldf_aa003bf3-ca26-468d-975a-5ceaa0361f14/manager/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.147085 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-575v4_8c2d89eb-7d33-4268-901a-69b008f224d4/kube-rbac-proxy/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.196866 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-575v4_8c2d89eb-7d33-4268-901a-69b008f224d4/manager/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.381227 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-7sghv_2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629/kube-rbac-proxy/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.476610 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-7sghv_2bcd4c17-2e7d-4a3f-91e1-e6542cb2e629/manager/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.644244 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-6dqrl_8ef0dcc5-529c-4a68-ba57-c68198a73de0/kube-rbac-proxy/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.690165 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-852df_0dfb923d-89c5-4fd0-af84-b73494c4cfc2/kube-rbac-proxy/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.719207 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-6dqrl_8ef0dcc5-529c-4a68-ba57-c68198a73de0/manager/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.851484 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-n7mj5_54de0bca-8a80-49a0-ae9f-0fe13fdeda11/kube-rbac-proxy/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.888850 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-852df_0dfb923d-89c5-4fd0-af84-b73494c4cfc2/manager/0.log"
Oct 06 10:11:37 crc kubenswrapper[4610]: I1006 10:11:37.962581 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-n7mj5_54de0bca-8a80-49a0-ae9f-0fe13fdeda11/manager/0.log"
Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.189655 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-6h5w4_95dcc684-207d-4745-949b-d2bd559b9f18/kube-rbac-proxy/0.log"
Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.249015 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-6h5w4_95dcc684-207d-4745-949b-d2bd559b9f18/manager/0.log"
Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.354148 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-5nhb8_e30086d8-8211-4ef0-ae80-ec1d79719f51/kube-rbac-proxy/0.log"
Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.426841 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-5nhb8_e30086d8-8211-4ef0-ae80-ec1d79719f51/manager/0.log"
Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.485180 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l_cc6ec685-6841-44c5-8315-462e605aa2d0/kube-rbac-proxy/0.log"
Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.608994 4610 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cdff8l_cc6ec685-6841-44c5-8315-462e605aa2d0/manager/0.log" Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.696378 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-669d7f654d-zkg2w_533cbdde-bc4c-43b3-a9dd-e72d9b1aba90/kube-rbac-proxy/0.log" Oct 06 10:11:38 crc kubenswrapper[4610]: I1006 10:11:38.962846 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6497dff45c-kjs56_f88899ff-f714-4c64-8a83-bf97a4c80c1b/kube-rbac-proxy/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.137731 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6497dff45c-kjs56_f88899ff-f714-4c64-8a83-bf97a4c80c1b/operator/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.171247 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qh9tx_02ca177f-d4f8-419b-babe-caeb9a7272fe/registry-server/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.454907 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-j9k2v_10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95/kube-rbac-proxy/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.609144 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-j9k2v_10e10f08-ef5c-4b4e-8f14-f99f4d0ffb95/manager/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.750781 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-x4rkd_ce2175a4-fac2-4259-91c9-6857fabd2755/kube-rbac-proxy/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.769385 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-669d7f654d-zkg2w_533cbdde-bc4c-43b3-a9dd-e72d9b1aba90/manager/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.798802 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-x4rkd_ce2175a4-fac2-4259-91c9-6857fabd2755/manager/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.917245 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4qv4d_ae310e32-abf5-4646-a09d-bbf21cd33dc6/operator/0.log" Oct 06 10:11:39 crc kubenswrapper[4610]: I1006 10:11:39.955799 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-twqvt_becf25ed-9d23-4cfa-afe3-7301d5476a7d/manager/0.log" Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.013230 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-twqvt_becf25ed-9d23-4cfa-afe3-7301d5476a7d/kube-rbac-proxy/0.log" Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.092268 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-df2ht_f40be14e-8191-4b07-8f45-01a5d18ac504/kube-rbac-proxy/0.log" Oct 06 10:11:40 crc kubenswrapper[4610]: 
Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.221313 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-df2ht_f40be14e-8191-4b07-8f45-01a5d18ac504/manager/0.log"
Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.260188 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-sp9cd_15cb4fda-d42c-4ce7-a195-8476f589676e/kube-rbac-proxy/0.log"
Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.301600 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-sp9cd_15cb4fda-d42c-4ce7-a195-8476f589676e/manager/0.log"
Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.406949 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-9lbhh_2296f857-2cd2-45d3-907c-94e9eb4262ab/kube-rbac-proxy/0.log"
Oct 06 10:11:40 crc kubenswrapper[4610]: I1006 10:11:40.450660 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-9lbhh_2296f857-2cd2-45d3-907c-94e9eb4262ab/manager/0.log"
Oct 06 10:11:57 crc kubenswrapper[4610]: I1006 10:11:57.470513 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hljsk_351aa4d4-e29f-4405-9985-5953396ca08e/control-plane-machine-set-operator/0.log"
Oct 06 10:11:57 crc kubenswrapper[4610]: I1006 10:11:57.579919 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9g4kq_db79ee81-c008-4374-9523-e762c47c9668/kube-rbac-proxy/0.log"
Oct 06 10:11:57 crc kubenswrapper[4610]: I1006 10:11:57.658888 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9g4kq_db79ee81-c008-4374-9523-e762c47c9668/machine-api-operator/0.log"
Oct 06 10:12:03 crc kubenswrapper[4610]: I1006 10:12:03.992272 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xn5xz"]
Oct 06 10:12:03 crc kubenswrapper[4610]: E1006 10:12:03.993456 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6640f602-9e8f-471a-ba56-c2e01ae7c629" containerName="container-00"
Oct 06 10:12:03 crc kubenswrapper[4610]: I1006 10:12:03.993475 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="6640f602-9e8f-471a-ba56-c2e01ae7c629" containerName="container-00"
Oct 06 10:12:03 crc kubenswrapper[4610]: I1006 10:12:03.993781 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="6640f602-9e8f-471a-ba56-c2e01ae7c629" containerName="container-00"
Oct 06 10:12:03 crc kubenswrapper[4610]: I1006 10:12:03.995504 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.014508 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xn5xz"]
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.114430 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-catalog-content\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.114511 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwxg\" (UniqueName: \"kubernetes.io/projected/3aabadf0-0664-4a2e-a233-403385fad464-kube-api-access-sxwxg\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.114596 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-utilities\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.216731 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-utilities\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.216858 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-catalog-content\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.216898 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwxg\" (UniqueName: \"kubernetes.io/projected/3aabadf0-0664-4a2e-a233-403385fad464-kube-api-access-sxwxg\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.217563 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-utilities\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.217793 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-catalog-content\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.239757 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwxg\" (UniqueName: \"kubernetes.io/projected/3aabadf0-0664-4a2e-a233-403385fad464-kube-api-access-sxwxg\") pod \"community-operators-xn5xz\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") " pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:04 crc kubenswrapper[4610]: I1006 10:12:04.318624 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:06 crc kubenswrapper[4610]: I1006 10:12:06.252919 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xn5xz"]
Oct 06 10:12:06 crc kubenswrapper[4610]: I1006 10:12:06.580316 4610 generic.go:334] "Generic (PLEG): container finished" podID="3aabadf0-0664-4a2e-a233-403385fad464" containerID="eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a" exitCode=0
Oct 06 10:12:06 crc kubenswrapper[4610]: I1006 10:12:06.580650 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerDied","Data":"eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a"}
Oct 06 10:12:06 crc kubenswrapper[4610]: I1006 10:12:06.580703 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerStarted","Data":"70ef8a124285b9c44168db9583599d403b2857b198bf30da17c308cf4b9a3d09"}
Oct 06 10:12:06 crc kubenswrapper[4610]: I1006 10:12:06.582343 4610 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 10:12:07 crc kubenswrapper[4610]: I1006 10:12:07.590278 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerStarted","Data":"561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224"}
Oct 06 10:12:09 crc kubenswrapper[4610]: I1006 10:12:09.612779 4610 generic.go:334] "Generic (PLEG): container finished" podID="3aabadf0-0664-4a2e-a233-403385fad464" containerID="561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224" exitCode=0
Oct 06 10:12:09 crc kubenswrapper[4610]: I1006 10:12:09.612896 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerDied","Data":"561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224"}
Oct 06 10:12:10 crc kubenswrapper[4610]: I1006 10:12:10.624321 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerStarted","Data":"36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd"}
Oct 06 10:12:10 crc kubenswrapper[4610]: I1006 10:12:10.647052 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xn5xz" podStartSLOduration=4.220524286 podStartE2EDuration="7.647019168s" podCreationTimestamp="2025-10-06 10:12:03 +0000 UTC" firstStartedPulling="2025-10-06 10:12:06.582126274 +0000 UTC m=+5458.297179662" lastFinishedPulling="2025-10-06 10:12:10.008621146 +0000 UTC m=+5461.723674544" observedRunningTime="2025-10-06 10:12:10.64248005 +0000 UTC m=+5462.357533468" watchObservedRunningTime="2025-10-06 10:12:10.647019168 +0000 UTC m=+5462.362072556"
Oct 06 10:12:12 crc kubenswrapper[4610]: I1006 10:12:12.603322 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-pxwn5_66256ae8-d5ea-4800-85a5-5b61f7475b8e/cert-manager-controller/0.log"
Oct 06 10:12:12 crc kubenswrapper[4610]: I1006 10:12:12.831936 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-9rmxk_40b562f4-5aac-4a81-b2b9-7a449b662cfc/cert-manager-cainjector/0.log"
Oct 06 10:12:13 crc kubenswrapper[4610]: I1006 10:12:13.137697 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zslmc_e7133e75-e1cc-410d-828b-18221c64707c/cert-manager-webhook/0.log"
Oct 06 10:12:14 crc kubenswrapper[4610]: I1006 10:12:14.319678 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:14 crc kubenswrapper[4610]: I1006 10:12:14.319769 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:14 crc kubenswrapper[4610]: I1006 10:12:14.566105 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:16 crc kubenswrapper[4610]: I1006 10:12:16.469567 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 10:12:16 crc kubenswrapper[4610]: I1006 10:12:16.469967 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 10:12:24 crc kubenswrapper[4610]: I1006 10:12:24.378786 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:24 crc kubenswrapper[4610]: I1006 10:12:24.428084 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xn5xz"]
Oct 06 10:12:24 crc kubenswrapper[4610]: I1006 10:12:24.756033 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xn5xz" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="registry-server" containerID="cri-o://36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd" gracePeriod=2
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.315211 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.357797 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-catalog-content\") pod \"3aabadf0-0664-4a2e-a233-403385fad464\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") "
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.357890 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxwxg\" (UniqueName: \"kubernetes.io/projected/3aabadf0-0664-4a2e-a233-403385fad464-kube-api-access-sxwxg\") pod \"3aabadf0-0664-4a2e-a233-403385fad464\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") "
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.358024 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-utilities\") pod \"3aabadf0-0664-4a2e-a233-403385fad464\" (UID: \"3aabadf0-0664-4a2e-a233-403385fad464\") "
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.358915 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-utilities" (OuterVolumeSpecName: "utilities") pod "3aabadf0-0664-4a2e-a233-403385fad464" (UID: "3aabadf0-0664-4a2e-a233-403385fad464"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.381429 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aabadf0-0664-4a2e-a233-403385fad464-kube-api-access-sxwxg" (OuterVolumeSpecName: "kube-api-access-sxwxg") pod "3aabadf0-0664-4a2e-a233-403385fad464" (UID: "3aabadf0-0664-4a2e-a233-403385fad464"). InnerVolumeSpecName "kube-api-access-sxwxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.404974 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aabadf0-0664-4a2e-a233-403385fad464" (UID: "3aabadf0-0664-4a2e-a233-403385fad464"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.460172 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.460355 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxwxg\" (UniqueName: \"kubernetes.io/projected/3aabadf0-0664-4a2e-a233-403385fad464-kube-api-access-sxwxg\") on node \"crc\" DevicePath \"\""
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.460455 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aabadf0-0664-4a2e-a233-403385fad464-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.769349 4610 generic.go:334] "Generic (PLEG): container finished" podID="3aabadf0-0664-4a2e-a233-403385fad464" containerID="36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd" exitCode=0
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.769398 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerDied","Data":"36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd"}
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.769397 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xn5xz"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.769433 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn5xz" event={"ID":"3aabadf0-0664-4a2e-a233-403385fad464","Type":"ContainerDied","Data":"70ef8a124285b9c44168db9583599d403b2857b198bf30da17c308cf4b9a3d09"}
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.769451 4610 scope.go:117] "RemoveContainer" containerID="36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.825643 4610 scope.go:117] "RemoveContainer" containerID="561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.831131 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xn5xz"]
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.881386 4610 scope.go:117] "RemoveContainer" containerID="eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.887803 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xn5xz"]
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.928322 4610 scope.go:117] "RemoveContainer" containerID="36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd"
Oct 06 10:12:25 crc kubenswrapper[4610]: E1006 10:12:25.928875 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd\": container with ID starting with 36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd not found: ID does not exist" containerID="36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.928974 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd"} err="failed to get container status \"36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd\": rpc error: code = NotFound desc = could not find container \"36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd\": container with ID starting with 36460456f240819665f0c92c6c459e6a7dbe873302fb543f369c3342567d99bd not found: ID does not exist"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.929082 4610 scope.go:117] "RemoveContainer" containerID="561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224"
Oct 06 10:12:25 crc kubenswrapper[4610]: E1006 10:12:25.929530 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224\": container with ID starting with 561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224 not found: ID does not exist" containerID="561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.929602 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224"} err="failed to get container status \"561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224\": rpc error: code = NotFound desc = could not find container \"561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224\": container with ID starting with 561a8e0e2a85fdf0d7e3b4e49ad161e5312824264cf1c5b1c0c89a3ee8b44224 not found: ID does not exist"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.929664 4610 scope.go:117] "RemoveContainer" containerID="eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a"
Oct 06 10:12:25 crc kubenswrapper[4610]: E1006 10:12:25.929941 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a\": container with ID starting with eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a not found: ID does not exist" containerID="eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a"
Oct 06 10:12:25 crc kubenswrapper[4610]: I1006 10:12:25.929975 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a"} err="failed to get container status \"eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a\": rpc error: code = NotFound desc = could not find container \"eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a\": container with ID starting with eac2aa76fc5bd29edb660b36807a15f8881c8ec175dd0565033fee6509c1ce6a not found: ID does not exist"
Oct 06 10:12:26 crc kubenswrapper[4610]: I1006 10:12:26.361245 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-tjvnk_9590cbb8-dcf7-4c56-a984-028943b510d5/nmstate-console-plugin/0.log"
Oct 06 10:12:26 crc kubenswrapper[4610]: I1006 10:12:26.613250 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4x98k_77dcdec2-c766-467b-a369-11ca28c22ae7/kube-rbac-proxy/0.log"
Oct 06 10:12:26 crc kubenswrapper[4610]: I1006 10:12:26.641524 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-47k9v_bd9ce7eb-b1fc-4636-93fc-d007702a746f/nmstate-handler/0.log"
Oct 06 10:12:26 crc kubenswrapper[4610]: I1006 10:12:26.721600 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4x98k_77dcdec2-c766-467b-a369-11ca28c22ae7/nmstate-metrics/0.log"
Oct 06 10:12:26 crc kubenswrapper[4610]: I1006 10:12:26.781508 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bzk4l_8ceb9be4-5b44-4da2-adb3-fcfca400d23a/nmstate-operator/0.log"
Oct 06 10:12:26 crc kubenswrapper[4610]: I1006 10:12:26.981106 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-m8xhq_340ebace-99cf-4a2b-aaef-975b3480a795/nmstate-webhook/0.log"
Oct 06 10:12:27 crc kubenswrapper[4610]: I1006 10:12:27.081635 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aabadf0-0664-4a2e-a233-403385fad464" path="/var/lib/kubelet/pods/3aabadf0-0664-4a2e-a233-403385fad464/volumes"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.263057 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-855xl_ddd30bb0-f54e-4aa2-81c2-f27b83aaf443/kube-rbac-proxy/0.log"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.389910 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-855xl_ddd30bb0-f54e-4aa2-81c2-f27b83aaf443/controller/0.log"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.506293 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.677825 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.698815 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.756654 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log"
Oct 06 10:12:41 crc kubenswrapper[4610]: I1006 10:12:41.781313 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.031962 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.042488 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.043211 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.050215 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.315699 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-reloader/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.330417 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-frr-files/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.339078 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/controller/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.348414 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/cp-metrics/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.566622 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/frr-metrics/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.617847 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/kube-rbac-proxy/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.653343 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/kube-rbac-proxy-frr/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.818607 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/reloader/0.log"
Oct 06 10:12:42 crc kubenswrapper[4610]: I1006 10:12:42.987567 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-6pnsd_e0af20a6-573f-4421-b9a4-5d5005a855b8/frr-k8s-webhook-server/0.log"
Oct 06 10:12:43 crc kubenswrapper[4610]: I1006 10:12:43.207772 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78b8b54fdd-fwfzv_4864fd4e-baeb-4b35-ae9b-b41f43515efd/manager/0.log"
Oct 06 10:12:43 crc kubenswrapper[4610]: I1006 10:12:43.475895 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b496585dd-ndrsg_f2f355c4-bea3-46ef-b5bf-d7393c884ac1/webhook-server/0.log"
Oct 06 10:12:43 crc kubenswrapper[4610]: I1006 10:12:43.575464 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nssl6_0585866f-da3e-4ab2-83a7-e0819349eb4d/kube-rbac-proxy/0.log"
Oct 06 10:12:43 crc kubenswrapper[4610]: I1006 10:12:43.817638 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rmk9w_0f88e64c-929a-4a97-a3a1-a92face17060/frr/0.log"
Oct 06 10:12:44 crc kubenswrapper[4610]: I1006 10:12:44.071836 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nssl6_0585866f-da3e-4ab2-83a7-e0819349eb4d/speaker/0.log"
Oct 06 10:12:46 crc kubenswrapper[4610]: I1006 10:12:46.468731 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 10:12:46 crc kubenswrapper[4610]: I1006 10:12:46.469109 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.395003 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/util/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.484923 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/util/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.544522 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/pull/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.592222 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/pull/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.740791 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/util/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.792461 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/extract/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.805540 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tgwhk_9e1e51cb-d7f6-4b8b-8c1c-46c166179994/pull/0.log"
Oct 06 10:12:58 crc kubenswrapper[4610]: I1006 10:12:58.948939 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-utilities/0.log"
Oct 06 10:12:59 crc kubenswrapper[4610]: I1006 10:12:59.281907 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-content/0.log"
Oct 06 10:12:59 crc kubenswrapper[4610]: I1006 10:12:59.297262 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-utilities/0.log"
Oct 06 10:12:59 crc kubenswrapper[4610]: I1006 10:12:59.346804 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-content/0.log"
Oct 06 10:12:59 crc kubenswrapper[4610]: I1006 10:12:59.470782 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-utilities/0.log"
Oct 06 10:12:59 crc kubenswrapper[4610]: I1006 10:12:59.602788 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/extract-content/0.log"
Oct 06 10:12:59 crc kubenswrapper[4610]: I1006 10:12:59.780403 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-utilities/0.log"
Oct 06 10:13:00 crc kubenswrapper[4610]: I1006 10:13:00.071084 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-utilities/0.log"
Oct 06 10:13:00 crc kubenswrapper[4610]: I1006 10:13:00.128065 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-content/0.log"
Oct 06 10:13:00 crc kubenswrapper[4610]: I1006 10:13:00.197081 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-content/0.log"
Oct 06 10:13:00 crc kubenswrapper[4610]: I1006 10:13:00.251610 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qrngf_bee7def6-3268-4497-b20c-c0133ade55de/registry-server/0.log"
Oct 06 10:13:00 crc kubenswrapper[4610]: I1006 10:13:00.740342 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-content/0.log"
Oct 06 10:13:00 crc kubenswrapper[4610]: I1006 10:13:00.746074 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/extract-utilities/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.163226 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/util/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.325819 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/util/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.326517 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/pull/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.432147 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f22rd_a67f97c9-f65d-4818-9b7d-568ab33ac02f/registry-server/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.443350 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/pull/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.651363 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/extract/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.668504 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/util/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.691241 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cndprt_97e4f094-7f15-4140-b0dd-10f545a9fef3/pull/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.877115 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q9tl7_2f273f28-c469-4f2f-a0de-bad2dd1345cb/marketplace-operator/0.log"
Oct 06 10:13:01 crc kubenswrapper[4610]: I1006 10:13:01.972053 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-utilities/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.163123 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-utilities/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.202474 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-content/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.211259 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-content/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.345818 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-content/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.420802 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-utilities/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.472324 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/extract-utilities/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.524865 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trtt8_1db48478-61a7-46e8-87f2-7c4201194e49/registry-server/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.760739 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-utilities/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.761812 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-content/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.816809 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-content/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.981819 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-utilities/0.log"
Oct 06 10:13:02 crc kubenswrapper[4610]: I1006 10:13:02.998573 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/extract-content/0.log"
Oct 06 10:13:03 crc kubenswrapper[4610]: I1006 10:13:03.490876 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-x6x4n_13613787-1366-4ea9-8add-d39428f1514f/registry-server/0.log"
Oct 06 10:13:16 crc kubenswrapper[4610]: I1006 10:13:16.468855 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 10:13:16 crc kubenswrapper[4610]: I1006 10:13:16.469516 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 10:13:16 crc kubenswrapper[4610]: I1006 10:13:16.469581 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr"
Oct 06 10:13:16 crc kubenswrapper[4610]: I1006 10:13:16.470459 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4e47ed9c33d6aa6bfd0c2be196b34f713fb291d41b3eb5f6debbcc208ba308a"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 10:13:16 crc kubenswrapper[4610]: I1006 10:13:16.470520 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://f4e47ed9c33d6aa6bfd0c2be196b34f713fb291d41b3eb5f6debbcc208ba308a" gracePeriod=600
Oct 06 10:13:17 crc kubenswrapper[4610]: I1006 10:13:17.257036 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="f4e47ed9c33d6aa6bfd0c2be196b34f713fb291d41b3eb5f6debbcc208ba308a" exitCode=0
Oct 06 10:13:17 crc kubenswrapper[4610]: I1006 10:13:17.257318 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"f4e47ed9c33d6aa6bfd0c2be196b34f713fb291d41b3eb5f6debbcc208ba308a"}
Oct 06 10:13:17 crc kubenswrapper[4610]: I1006 10:13:17.257344 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerStarted","Data":"7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"}
Oct 06 10:13:17 crc kubenswrapper[4610]: I1006 10:13:17.257359 4610 scope.go:117] "RemoveContainer" containerID="22aa7253285b5ba78349515cee2f012202c093ea2b92105c452b0a9b078032af"
Oct 06 10:13:24 crc kubenswrapper[4610]: E1006 10:13:24.075348 4610 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.21:36482->38.129.56.21:42153: write tcp 38.129.56.21:36482->38.129.56.21:42153: write: connection reset by peer
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.180253 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"]
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.180253 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"]
Oct 06 10:15:00 crc kubenswrapper[4610]: E1006 10:15:00.181124 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="registry-server"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.181140 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="registry-server"
Oct 06 10:15:00 crc kubenswrapper[4610]: E1006 10:15:00.181173 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="extract-content"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.181181 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="extract-content"
Oct 06 10:15:00 crc kubenswrapper[4610]: E1006 10:15:00.181199 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="extract-utilities"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.181208 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="extract-utilities"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.181440 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aabadf0-0664-4a2e-a233-403385fad464" containerName="registry-server"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.182210 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.185744 4610 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.192708 4610 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.198349 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"]
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.267339 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4l5\" (UniqueName: \"kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.267384 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894de10d-ab8d-4aa4-b25a-46b49cca29c9-secret-volume\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.267408 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894de10d-ab8d-4aa4-b25a-46b49cca29c9-config-volume\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.369127 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4l5\" (UniqueName: \"kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.369212 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894de10d-ab8d-4aa4-b25a-46b49cca29c9-secret-volume\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.369255 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894de10d-ab8d-4aa4-b25a-46b49cca29c9-config-volume\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.370633 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894de10d-ab8d-4aa4-b25a-46b49cca29c9-config-volume\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.383809 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894de10d-ab8d-4aa4-b25a-46b49cca29c9-secret-volume\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.409838 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4l5\" (UniqueName: \"kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5\") pod \"collect-profiles-29329095-8cr89\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
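Volume setup above follows the reconciler's phases in order: VerifyControllerAttachedVolume confirms the volume is (roughly speaking) attached in the actual state of the world, MountVolume starts the operation, and MountVolume.SetUp reports success. The UniqueName strings follow a "<plugin>/<podUID>-<volumeName>" layout; the sketch below splits one apart for log analysis. The layout is inferred from these lines, not from a documented API.

    // Sketch: split the volume UniqueName strings seen above into plugin,
    // pod UID, and volume name.
    package main

    import (
        "fmt"
        "regexp"
    )

    // "<plugin>/<36-char pod UID>-<volume name>", per the log lines above.
    var uniqueName = regexp.MustCompile(`^(kubernetes\.io/[a-z-]+)/([0-9a-f-]{36})-(.+)$`)

    func main() {
        s := "kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5"
        if m := uniqueName.FindStringSubmatch(s); m != nil {
            fmt.Printf("plugin=%s podUID=%s volume=%s\n", m[1], m[2], m[3])
        }
    }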
Oct 06 10:15:00 crc kubenswrapper[4610]: I1006 10:15:00.519234 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:01 crc kubenswrapper[4610]: I1006 10:15:01.299865 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"]
Oct 06 10:15:01 crc kubenswrapper[4610]: I1006 10:15:01.420926 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89" event={"ID":"894de10d-ab8d-4aa4-b25a-46b49cca29c9","Type":"ContainerStarted","Data":"dc03a4c637598e54b97c00ae53adf5ba17872b93c03742dc150a27dcd60962df"}
Oct 06 10:15:02 crc kubenswrapper[4610]: I1006 10:15:02.438038 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89" event={"ID":"894de10d-ab8d-4aa4-b25a-46b49cca29c9","Type":"ContainerStarted","Data":"08c678e4548c3b87b8beafb395e874234da3005715cbb95103322aa566eb5091"}
Oct 06 10:15:02 crc kubenswrapper[4610]: I1006 10:15:02.464112 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89" podStartSLOduration=2.464091441 podStartE2EDuration="2.464091441s" podCreationTimestamp="2025-10-06 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 10:15:02.458288009 +0000 UTC m=+5634.173341397" watchObservedRunningTime="2025-10-06 10:15:02.464091441 +0000 UTC m=+5634.179144859"
Oct 06 10:15:03 crc kubenswrapper[4610]: I1006 10:15:03.448966 4610 generic.go:334] "Generic (PLEG): container finished" podID="894de10d-ab8d-4aa4-b25a-46b49cca29c9" containerID="08c678e4548c3b87b8beafb395e874234da3005715cbb95103322aa566eb5091" exitCode=0
Oct 06 10:15:03 crc kubenswrapper[4610]: I1006 10:15:03.449026 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89" event={"ID":"894de10d-ab8d-4aa4-b25a-46b49cca29c9","Type":"ContainerDied","Data":"08c678e4548c3b87b8beafb395e874234da3005715cbb95103322aa566eb5091"}
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.813720 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.859948 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894de10d-ab8d-4aa4-b25a-46b49cca29c9-secret-volume\") pod \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") "
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.860021 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw4l5\" (UniqueName: \"kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5\") pod \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") "
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.860065 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894de10d-ab8d-4aa4-b25a-46b49cca29c9-config-volume\") pod \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\" (UID: \"894de10d-ab8d-4aa4-b25a-46b49cca29c9\") "
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.860956 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894de10d-ab8d-4aa4-b25a-46b49cca29c9-config-volume" (OuterVolumeSpecName: "config-volume") pod "894de10d-ab8d-4aa4-b25a-46b49cca29c9" (UID: "894de10d-ab8d-4aa4-b25a-46b49cca29c9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.866228 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5" (OuterVolumeSpecName: "kube-api-access-pw4l5") pod "894de10d-ab8d-4aa4-b25a-46b49cca29c9" (UID: "894de10d-ab8d-4aa4-b25a-46b49cca29c9"). InnerVolumeSpecName "kube-api-access-pw4l5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.867405 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894de10d-ab8d-4aa4-b25a-46b49cca29c9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "894de10d-ab8d-4aa4-b25a-46b49cca29c9" (UID: "894de10d-ab8d-4aa4-b25a-46b49cca29c9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.961663 4610 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894de10d-ab8d-4aa4-b25a-46b49cca29c9-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.961705 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw4l5\" (UniqueName: \"kubernetes.io/projected/894de10d-ab8d-4aa4-b25a-46b49cca29c9-kube-api-access-pw4l5\") on node \"crc\" DevicePath \"\""
Oct 06 10:15:04 crc kubenswrapper[4610]: I1006 10:15:04.961719 4610 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894de10d-ab8d-4aa4-b25a-46b49cca29c9-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 10:15:05 crc kubenswrapper[4610]: I1006 10:15:05.474030 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89" event={"ID":"894de10d-ab8d-4aa4-b25a-46b49cca29c9","Type":"ContainerDied","Data":"dc03a4c637598e54b97c00ae53adf5ba17872b93c03742dc150a27dcd60962df"}
Oct 06 10:15:05 crc kubenswrapper[4610]: I1006 10:15:05.474079 4610 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc03a4c637598e54b97c00ae53adf5ba17872b93c03742dc150a27dcd60962df"
Oct 06 10:15:05 crc kubenswrapper[4610]: I1006 10:15:05.474134 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329095-8cr89"
Oct 06 10:15:05 crc kubenswrapper[4610]: I1006 10:15:05.549308 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"]
Oct 06 10:15:05 crc kubenswrapper[4610]: I1006 10:15:05.559117 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329050-drq56"]
Oct 06 10:15:07 crc kubenswrapper[4610]: I1006 10:15:07.081715 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99ad092-7458-43c4-8848-19dd87947fef" path="/var/lib/kubelet/pods/c99ad092-7458-43c4-8848-19dd87947fef/volumes"
pods=["openshift-marketplace/certified-operators-xscgx"] Oct 06 10:15:28 crc kubenswrapper[4610]: E1006 10:15:28.322423 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894de10d-ab8d-4aa4-b25a-46b49cca29c9" containerName="collect-profiles" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.322435 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="894de10d-ab8d-4aa4-b25a-46b49cca29c9" containerName="collect-profiles" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.322631 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="894de10d-ab8d-4aa4-b25a-46b49cca29c9" containerName="collect-profiles" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.323848 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.331482 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xscgx"] Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.490871 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-utilities\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.490923 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-catalog-content\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.491138 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7hz8\" (UniqueName: \"kubernetes.io/projected/593c2553-a355-4578-b8e5-7a96e92a7fe8-kube-api-access-q7hz8\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.592846 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-utilities\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.593135 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-catalog-content\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.593194 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7hz8\" (UniqueName: \"kubernetes.io/projected/593c2553-a355-4578-b8e5-7a96e92a7fe8-kube-api-access-q7hz8\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.593547 4610 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-utilities\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.593722 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-catalog-content\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.616938 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7hz8\" (UniqueName: \"kubernetes.io/projected/593c2553-a355-4578-b8e5-7a96e92a7fe8-kube-api-access-q7hz8\") pod \"certified-operators-xscgx\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") " pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:28 crc kubenswrapper[4610]: I1006 10:15:28.645064 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:29 crc kubenswrapper[4610]: I1006 10:15:29.204445 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xscgx"] Oct 06 10:15:29 crc kubenswrapper[4610]: I1006 10:15:29.713383 4610 generic.go:334] "Generic (PLEG): container finished" podID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerID="6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a" exitCode=0 Oct 06 10:15:29 crc kubenswrapper[4610]: I1006 10:15:29.713482 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerDied","Data":"6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a"} Oct 06 10:15:29 crc kubenswrapper[4610]: I1006 10:15:29.713546 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerStarted","Data":"c49bf3a1acc4511c5972093f0d024c8d7db76ba82a7f4a9f2c8a04aba6a3e027"} Oct 06 10:15:30 crc kubenswrapper[4610]: I1006 10:15:30.727523 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerStarted","Data":"16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7"} Oct 06 10:15:31 crc kubenswrapper[4610]: I1006 10:15:31.739446 4610 generic.go:334] "Generic (PLEG): container finished" podID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerID="16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7" exitCode=0 Oct 06 10:15:31 crc kubenswrapper[4610]: I1006 10:15:31.739498 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerDied","Data":"16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7"} Oct 06 10:15:32 crc kubenswrapper[4610]: I1006 10:15:32.750557 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" 
event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerStarted","Data":"d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389"} Oct 06 10:15:32 crc kubenswrapper[4610]: I1006 10:15:32.771710 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xscgx" podStartSLOduration=2.226018908 podStartE2EDuration="4.7716917s" podCreationTimestamp="2025-10-06 10:15:28 +0000 UTC" firstStartedPulling="2025-10-06 10:15:29.719626219 +0000 UTC m=+5661.434679607" lastFinishedPulling="2025-10-06 10:15:32.265299001 +0000 UTC m=+5663.980352399" observedRunningTime="2025-10-06 10:15:32.766102014 +0000 UTC m=+5664.481155402" watchObservedRunningTime="2025-10-06 10:15:32.7716917 +0000 UTC m=+5664.486745088" Oct 06 10:15:38 crc kubenswrapper[4610]: I1006 10:15:38.645819 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:38 crc kubenswrapper[4610]: I1006 10:15:38.648628 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:38 crc kubenswrapper[4610]: I1006 10:15:38.699384 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:38 crc kubenswrapper[4610]: I1006 10:15:38.861807 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xscgx" Oct 06 10:15:38 crc kubenswrapper[4610]: I1006 10:15:38.945013 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xscgx"] Oct 06 10:15:40 crc kubenswrapper[4610]: I1006 10:15:40.843224 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xscgx" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="registry-server" containerID="cri-o://d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389" gracePeriod=2 Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.301584 4610 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.301584 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xscgx"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.389706 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-utilities\") pod \"593c2553-a355-4578-b8e5-7a96e92a7fe8\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") "
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.390194 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-catalog-content\") pod \"593c2553-a355-4578-b8e5-7a96e92a7fe8\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") "
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.390872 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7hz8\" (UniqueName: \"kubernetes.io/projected/593c2553-a355-4578-b8e5-7a96e92a7fe8-kube-api-access-q7hz8\") pod \"593c2553-a355-4578-b8e5-7a96e92a7fe8\" (UID: \"593c2553-a355-4578-b8e5-7a96e92a7fe8\") "
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.396185 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-utilities" (OuterVolumeSpecName: "utilities") pod "593c2553-a355-4578-b8e5-7a96e92a7fe8" (UID: "593c2553-a355-4578-b8e5-7a96e92a7fe8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.396964 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593c2553-a355-4578-b8e5-7a96e92a7fe8-kube-api-access-q7hz8" (OuterVolumeSpecName: "kube-api-access-q7hz8") pod "593c2553-a355-4578-b8e5-7a96e92a7fe8" (UID: "593c2553-a355-4578-b8e5-7a96e92a7fe8"). InnerVolumeSpecName "kube-api-access-q7hz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.460606 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "593c2553-a355-4578-b8e5-7a96e92a7fe8" (UID: "593c2553-a355-4578-b8e5-7a96e92a7fe8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.493518 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7hz8\" (UniqueName: \"kubernetes.io/projected/593c2553-a355-4578-b8e5-7a96e92a7fe8-kube-api-access-q7hz8\") on node \"crc\" DevicePath \"\""
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.493801 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.493889 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593c2553-a355-4578-b8e5-7a96e92a7fe8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.856921 4610 generic.go:334] "Generic (PLEG): container finished" podID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerID="d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389" exitCode=0
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.857280 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerDied","Data":"d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389"}
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.857312 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xscgx" event={"ID":"593c2553-a355-4578-b8e5-7a96e92a7fe8","Type":"ContainerDied","Data":"c49bf3a1acc4511c5972093f0d024c8d7db76ba82a7f4a9f2c8a04aba6a3e027"}
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.857333 4610 scope.go:117] "RemoveContainer" containerID="d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.857488 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xscgx"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.888303 4610 scope.go:117] "RemoveContainer" containerID="16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.936835 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xscgx"]
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.944010 4610 scope.go:117] "RemoveContainer" containerID="6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.954806 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xscgx"]
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.993842 4610 scope.go:117] "RemoveContainer" containerID="d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389"
Oct 06 10:15:41 crc kubenswrapper[4610]: E1006 10:15:41.994702 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389\": container with ID starting with d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389 not found: ID does not exist" containerID="d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.994798 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389"} err="failed to get container status \"d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389\": rpc error: code = NotFound desc = could not find container \"d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389\": container with ID starting with d30c865b8f810e5760df7d185f462a96d16815d3609a9e02771b99220ff49389 not found: ID does not exist"
Oct 06 10:15:41 crc kubenswrapper[4610]: I1006 10:15:41.994881 4610 scope.go:117] "RemoveContainer" containerID="16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7"
Oct 06 10:15:42 crc kubenswrapper[4610]: E1006 10:15:42.006013 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7\": container with ID starting with 16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7 not found: ID does not exist" containerID="16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7"
Oct 06 10:15:42 crc kubenswrapper[4610]: I1006 10:15:42.006089 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7"} err="failed to get container status \"16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7\": rpc error: code = NotFound desc = could not find container \"16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7\": container with ID starting with 16963d8135a36de6ed4dac745a8ad9ed3054de40e188647a34ddc30591cdfbe7 not found: ID does not exist"
Oct 06 10:15:42 crc kubenswrapper[4610]: I1006 10:15:42.006124 4610 scope.go:117] "RemoveContainer" containerID="6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a"
Oct 06 10:15:42 crc kubenswrapper[4610]: E1006 10:15:42.006420 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a\": container with ID starting with 6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a not found: ID does not exist" containerID="6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a"
Oct 06 10:15:42 crc kubenswrapper[4610]: I1006 10:15:42.006459 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a"} err="failed to get container status \"6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a\": rpc error: code = NotFound desc = could not find container \"6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a\": container with ID starting with 6d203404db29d166bf66dbf0b64a7e93a64b17e0bde587106a5a08d45e9a963a not found: ID does not exist"
Oct 06 10:15:43 crc kubenswrapper[4610]: I1006 10:15:43.080722 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" path="/var/lib/kubelet/pods/593c2553-a355-4578-b8e5-7a96e92a7fe8/volumes"
source="api" pods=["openshift-must-gather-jgpxl/must-gather-nmc77"] Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.076944 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jgpxl_must-gather-nmc77_9548ab5b-bf5e-46b1-a8be-f1020de63b13/copy/0.log" Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.079126 4610 generic.go:334] "Generic (PLEG): container finished" podID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerID="6b5faa4c49e9a9b2e20ead7754e3af8cb2067081f5e0deadddacf42b78a49207" exitCode=143 Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.355152 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jgpxl_must-gather-nmc77_9548ab5b-bf5e-46b1-a8be-f1020de63b13/copy/0.log" Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.355539 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.514780 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9548ab5b-bf5e-46b1-a8be-f1020de63b13-must-gather-output\") pod \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.514965 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ffg\" (UniqueName: \"kubernetes.io/projected/9548ab5b-bf5e-46b1-a8be-f1020de63b13-kube-api-access-c6ffg\") pod \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\" (UID: \"9548ab5b-bf5e-46b1-a8be-f1020de63b13\") " Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.526406 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9548ab5b-bf5e-46b1-a8be-f1020de63b13-kube-api-access-c6ffg" (OuterVolumeSpecName: "kube-api-access-c6ffg") pod "9548ab5b-bf5e-46b1-a8be-f1020de63b13" (UID: "9548ab5b-bf5e-46b1-a8be-f1020de63b13"). InnerVolumeSpecName "kube-api-access-c6ffg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.622157 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ffg\" (UniqueName: \"kubernetes.io/projected/9548ab5b-bf5e-46b1-a8be-f1020de63b13-kube-api-access-c6ffg\") on node \"crc\" DevicePath \"\"" Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.698264 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9548ab5b-bf5e-46b1-a8be-f1020de63b13-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9548ab5b-bf5e-46b1-a8be-f1020de63b13" (UID: "9548ab5b-bf5e-46b1-a8be-f1020de63b13"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 10:16:01 crc kubenswrapper[4610]: I1006 10:16:01.724323 4610 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9548ab5b-bf5e-46b1-a8be-f1020de63b13-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 10:16:02 crc kubenswrapper[4610]: I1006 10:16:02.089524 4610 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jgpxl_must-gather-nmc77_9548ab5b-bf5e-46b1-a8be-f1020de63b13/copy/0.log" Oct 06 10:16:02 crc kubenswrapper[4610]: I1006 10:16:02.091094 4610 scope.go:117] "RemoveContainer" containerID="6b5faa4c49e9a9b2e20ead7754e3af8cb2067081f5e0deadddacf42b78a49207" Oct 06 10:16:02 crc kubenswrapper[4610]: I1006 10:16:02.091105 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgpxl/must-gather-nmc77" Oct 06 10:16:02 crc kubenswrapper[4610]: I1006 10:16:02.121937 4610 scope.go:117] "RemoveContainer" containerID="c08c46ae012c644ac9b007567b0e2b6e0f1c23a2043c50c7a4c327757dc6187d" Oct 06 10:16:03 crc kubenswrapper[4610]: I1006 10:16:03.080332 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" path="/var/lib/kubelet/pods/9548ab5b-bf5e-46b1-a8be-f1020de63b13/volumes" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.190511 4610 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qx74l"] Oct 06 10:16:05 crc kubenswrapper[4610]: E1006 10:16:05.190959 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="extract-content" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.190975 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="extract-content" Oct 06 10:16:05 crc kubenswrapper[4610]: E1006 10:16:05.191005 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="extract-utilities" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191014 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="extract-utilities" Oct 06 10:16:05 crc kubenswrapper[4610]: E1006 10:16:05.191071 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerName="copy" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191081 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerName="copy" Oct 06 10:16:05 crc kubenswrapper[4610]: E1006 10:16:05.191099 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="registry-server" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191107 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="registry-server" Oct 06 10:16:05 crc kubenswrapper[4610]: E1006 10:16:05.191138 4610 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerName="gather" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191146 4610 state_mem.go:107] "Deleted CPUSet assignment" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerName="gather" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191391 4610 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerName="copy" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191421 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="9548ab5b-bf5e-46b1-a8be-f1020de63b13" containerName="gather" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.191434 4610 memory_manager.go:354] "RemoveStaleState removing state" podUID="593c2553-a355-4578-b8e5-7a96e92a7fe8" containerName="registry-server" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.203680 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.217701 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx74l"] Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.405826 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-catalog-content\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.405914 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t49sf\" (UniqueName: \"kubernetes.io/projected/b39197f7-a162-44a3-b546-558a4bdd8a83-kube-api-access-t49sf\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.405942 4610 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-utilities\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.507430 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-catalog-content\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.507510 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t49sf\" (UniqueName: \"kubernetes.io/projected/b39197f7-a162-44a3-b546-558a4bdd8a83-kube-api-access-t49sf\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.507538 4610 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-utilities\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.508113 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-utilities\") pod \"redhat-operators-qx74l\" 
(UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.508351 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-catalog-content\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.528478 4610 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t49sf\" (UniqueName: \"kubernetes.io/projected/b39197f7-a162-44a3-b546-558a4bdd8a83-kube-api-access-t49sf\") pod \"redhat-operators-qx74l\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") " pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:05 crc kubenswrapper[4610]: I1006 10:16:05.542578 4610 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx74l" Oct 06 10:16:06 crc kubenswrapper[4610]: I1006 10:16:06.196866 4610 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qx74l"] Oct 06 10:16:07 crc kubenswrapper[4610]: I1006 10:16:07.142840 4610 generic.go:334] "Generic (PLEG): container finished" podID="b39197f7-a162-44a3-b546-558a4bdd8a83" containerID="f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d" exitCode=0 Oct 06 10:16:07 crc kubenswrapper[4610]: I1006 10:16:07.142946 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerDied","Data":"f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d"} Oct 06 10:16:07 crc kubenswrapper[4610]: I1006 10:16:07.143162 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerStarted","Data":"b3ee9ac12cdab04b0432a222203951ac4f0882862f6c8cf82ef46bf26532c075"} Oct 06 10:16:08 crc kubenswrapper[4610]: I1006 10:16:08.164115 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerStarted","Data":"c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b"} Oct 06 10:16:11 crc kubenswrapper[4610]: I1006 10:16:11.199117 4610 generic.go:334] "Generic (PLEG): container finished" podID="b39197f7-a162-44a3-b546-558a4bdd8a83" containerID="c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b" exitCode=0 Oct 06 10:16:11 crc kubenswrapper[4610]: I1006 10:16:11.199159 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerDied","Data":"c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b"} Oct 06 10:16:12 crc kubenswrapper[4610]: I1006 10:16:12.210553 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerStarted","Data":"f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe"} Oct 06 10:16:12 crc kubenswrapper[4610]: I1006 10:16:12.231123 4610 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qx74l" 
Oct 06 10:16:15 crc kubenswrapper[4610]: I1006 10:16:15.543894 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qx74l"
Oct 06 10:16:15 crc kubenswrapper[4610]: I1006 10:16:15.544206 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qx74l"
Oct 06 10:16:16 crc kubenswrapper[4610]: I1006 10:16:16.471513 4610 patch_prober.go:28] interesting pod/machine-config-daemon-6w5xr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 10:16:16 crc kubenswrapper[4610]: I1006 10:16:16.471590 4610 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 10:16:16 crc kubenswrapper[4610]: I1006 10:16:16.471660 4610 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr"
Oct 06 10:16:16 crc kubenswrapper[4610]: I1006 10:16:16.473164 4610 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"} pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 10:16:16 crc kubenswrapper[4610]: I1006 10:16:16.473253 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerName="machine-config-daemon" containerID="cri-o://7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623" gracePeriod=600
Oct 06 10:16:16 crc kubenswrapper[4610]: E1006 10:16:16.591861 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 10:16:16 crc kubenswrapper[4610]: I1006 10:16:16.614975 4610 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qx74l" podUID="b39197f7-a162-44a3-b546-558a4bdd8a83" containerName="registry-server" probeResult="failure" output=<
Oct 06 10:16:16 crc kubenswrapper[4610]: 	timeout: failed to connect service ":50051" within 1s
Oct 06 10:16:16 crc kubenswrapper[4610]: >
Oct 06 10:16:17 crc kubenswrapper[4610]: I1006 10:16:17.263473 4610 generic.go:334] "Generic (PLEG): container finished" podID="99a19d05-9838-4c7d-aa2c-e778a2ef0148" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623" exitCode=0
Oct 06 10:16:17 crc kubenswrapper[4610]: I1006 10:16:17.263534 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" event={"ID":"99a19d05-9838-4c7d-aa2c-e778a2ef0148","Type":"ContainerDied","Data":"7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"}
Oct 06 10:16:17 crc kubenswrapper[4610]: I1006 10:16:17.263587 4610 scope.go:117] "RemoveContainer" containerID="f4e47ed9c33d6aa6bfd0c2be196b34f713fb291d41b3eb5f6debbcc208ba308a"
Oct 06 10:16:17 crc kubenswrapper[4610]: I1006 10:16:17.264467 4610 scope.go:117] "RemoveContainer" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"
Oct 06 10:16:17 crc kubenswrapper[4610]: E1006 10:16:17.264751 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 10:16:25 crc kubenswrapper[4610]: I1006 10:16:25.618707 4610 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qx74l"
Oct 06 10:16:25 crc kubenswrapper[4610]: I1006 10:16:25.701347 4610 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qx74l"
Oct 06 10:16:25 crc kubenswrapper[4610]: I1006 10:16:25.869198 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qx74l"]
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.369168 4610 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qx74l" podUID="b39197f7-a162-44a3-b546-558a4bdd8a83" containerName="registry-server" containerID="cri-o://f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe" gracePeriod=2
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.838388 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx74l"
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.952054 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t49sf\" (UniqueName: \"kubernetes.io/projected/b39197f7-a162-44a3-b546-558a4bdd8a83-kube-api-access-t49sf\") pod \"b39197f7-a162-44a3-b546-558a4bdd8a83\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") "
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.952614 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-catalog-content\") pod \"b39197f7-a162-44a3-b546-558a4bdd8a83\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") "
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.952690 4610 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-utilities\") pod \"b39197f7-a162-44a3-b546-558a4bdd8a83\" (UID: \"b39197f7-a162-44a3-b546-558a4bdd8a83\") "
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.954303 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-utilities" (OuterVolumeSpecName: "utilities") pod "b39197f7-a162-44a3-b546-558a4bdd8a83" (UID: "b39197f7-a162-44a3-b546-558a4bdd8a83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 10:16:27 crc kubenswrapper[4610]: I1006 10:16:27.958783 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39197f7-a162-44a3-b546-558a4bdd8a83-kube-api-access-t49sf" (OuterVolumeSpecName: "kube-api-access-t49sf") pod "b39197f7-a162-44a3-b546-558a4bdd8a83" (UID: "b39197f7-a162-44a3-b546-558a4bdd8a83"). InnerVolumeSpecName "kube-api-access-t49sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.039293 4610 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b39197f7-a162-44a3-b546-558a4bdd8a83" (UID: "b39197f7-a162-44a3-b546-558a4bdd8a83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.055627 4610 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.055691 4610 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39197f7-a162-44a3-b546-558a4bdd8a83-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.055702 4610 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t49sf\" (UniqueName: \"kubernetes.io/projected/b39197f7-a162-44a3-b546-558a4bdd8a83-kube-api-access-t49sf\") on node \"crc\" DevicePath \"\""
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.384605 4610 generic.go:334] "Generic (PLEG): container finished" podID="b39197f7-a162-44a3-b546-558a4bdd8a83" containerID="f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe" exitCode=0
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.384649 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerDied","Data":"f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe"}
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.384678 4610 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qx74l" event={"ID":"b39197f7-a162-44a3-b546-558a4bdd8a83","Type":"ContainerDied","Data":"b3ee9ac12cdab04b0432a222203951ac4f0882862f6c8cf82ef46bf26532c075"}
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.384698 4610 scope.go:117] "RemoveContainer" containerID="f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.384708 4610 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qx74l"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.422626 4610 scope.go:117] "RemoveContainer" containerID="c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.440674 4610 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qx74l"]
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.449995 4610 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qx74l"]
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.456650 4610 scope.go:117] "RemoveContainer" containerID="f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.511008 4610 scope.go:117] "RemoveContainer" containerID="f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe"
Oct 06 10:16:28 crc kubenswrapper[4610]: E1006 10:16:28.511959 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe\": container with ID starting with f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe not found: ID does not exist" containerID="f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.511991 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe"} err="failed to get container status \"f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe\": rpc error: code = NotFound desc = could not find container \"f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe\": container with ID starting with f2ae60aa03fd57d8b260e50ad0193b4c6e7dab7a3e4b6ded11f66d57647d9efe not found: ID does not exist"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.512013 4610 scope.go:117] "RemoveContainer" containerID="c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b"
Oct 06 10:16:28 crc kubenswrapper[4610]: E1006 10:16:28.512264 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b\": container with ID starting with c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b not found: ID does not exist" containerID="c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.512288 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b"} err="failed to get container status \"c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b\": rpc error: code = NotFound desc = could not find container \"c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b\": container with ID starting with c36045d580e51a0c55a2312d7d99ee5ff4800b9821f67cd50fa759dae586627b not found: ID does not exist"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.512303 4610 scope.go:117] "RemoveContainer" containerID="f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d"
Oct 06 10:16:28 crc kubenswrapper[4610]: E1006 10:16:28.512579 4610 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d\": container with ID starting with f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d not found: ID does not exist" containerID="f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d"
Oct 06 10:16:28 crc kubenswrapper[4610]: I1006 10:16:28.512600 4610 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d"} err="failed to get container status \"f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d\": rpc error: code = NotFound desc = could not find container \"f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d\": container with ID starting with f71840c1a17a02bd7e382228894779f1e237b3d38e5930fb6767859123dd843d not found: ID does not exist"
Oct 06 10:16:29 crc kubenswrapper[4610]: I1006 10:16:29.077164 4610 scope.go:117] "RemoveContainer" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"
Oct 06 10:16:29 crc kubenswrapper[4610]: E1006 10:16:29.078001 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 10:16:29 crc kubenswrapper[4610]: I1006 10:16:29.090774 4610 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39197f7-a162-44a3-b546-558a4bdd8a83" path="/var/lib/kubelet/pods/b39197f7-a162-44a3-b546-558a4bdd8a83/volumes"
Oct 06 10:16:42 crc kubenswrapper[4610]: I1006 10:16:42.070518 4610 scope.go:117] "RemoveContainer" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"
Oct 06 10:16:42 crc kubenswrapper[4610]: E1006 10:16:42.072453 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 10:16:55 crc kubenswrapper[4610]: I1006 10:16:55.071071 4610 scope.go:117] "RemoveContainer" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"
Oct 06 10:16:55 crc kubenswrapper[4610]: E1006 10:16:55.071722 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 10:17:06 crc kubenswrapper[4610]: I1006 10:17:06.071975 4610 scope.go:117] "RemoveContainer" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"
Oct 06 10:17:06 crc kubenswrapper[4610]: E1006 10:17:06.072762 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"
Oct 06 10:17:17 crc kubenswrapper[4610]: I1006 10:17:17.071530 4610 scope.go:117] "RemoveContainer" containerID="7668c73e7aadddbe03e19f68a09db39702b2c5021fe28a4f1d48e9a5d2483623"
Oct 06 10:17:17 crc kubenswrapper[4610]: E1006 10:17:17.072707 4610 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6w5xr_openshift-machine-config-operator(99a19d05-9838-4c7d-aa2c-e778a2ef0148)\"" pod="openshift-machine-config-operator/machine-config-daemon-6w5xr" podUID="99a19d05-9838-4c7d-aa2c-e778a2ef0148"